Dec 09 14:58:40 crc systemd[1]: Starting Kubernetes Kubelet... Dec 09 14:58:40 crc restorecon[4629]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Dec 09 14:58:40 
crc restorecon[4629]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 
14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc 
restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 
crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 
crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 
14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 09 14:58:40 crc 
restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 
14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 
14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc 
restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:40 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 09 14:58:41 crc restorecon[4629]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 09 14:58:41 crc kubenswrapper[4735]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 14:58:41 crc kubenswrapper[4735]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 09 14:58:41 crc kubenswrapper[4735]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 14:58:41 crc kubenswrapper[4735]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
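The long restorecon run above is expected on kubelet start: files under /var/lib/kubelet carry the container_file_t type with per-pod MCS categories (for example s0:c7,c13), and on RHEL-family targeted policy container_file_t is typically treated as a customizable type, so restorecon reports "not reset as customized by admin" and leaves the label in place unless it is forced to relabel. These entries are informational, not failures. The sketch below is one way to condense them for review; it is an assumption-laden helper, not part of the log: it assumes the journal text is supplied on stdin and that the message wording matches the lines shown here, including entries that wrap across output lines.

#!/usr/bin/env python3
"""Summarize restorecon 'not reset as customized by admin' journal entries."""
import re
import sys
from collections import Counter

# One entry reads:
#   ... restorecon[PID]: <path> not reset as customized by admin to <context>
# \s+ is used between fields so entries that wrap across output lines still match.
ENTRY = re.compile(
    r"restorecon\[\d+\]:\s+(?P<path>/\S+)\s+"
    r"not reset as customized by admin to\s+(?P<context>\S+)"
)
POD_UID = re.compile(r"/var/lib/kubelet/pods/(?P<uid>[^/]+)/")

def summarize(text: str) -> None:
    per_context = Counter()
    per_pod = Counter()
    for m in ENTRY.finditer(text):
        per_context[m.group("context")] += 1
        uid = POD_UID.search(m.group("path"))
        if uid:
            per_pod[uid.group("uid")] += 1
    print("skipped entries per SELinux context:")
    for ctx, n in per_context.most_common():
        print(f"  {n:6d}  {ctx}")
    print("skipped entries per pod UID (top 10):")
    for uid, n in per_pod.most_common(10):
        print(f"  {n:6d}  {uid}")

if __name__ == "__main__":
    summarize(sys.stdin.read())

For example, the captured text could be piped in with something like "journalctl -b -u kubelet | python3 summarize_restorecon.py"; the unit and script names here are illustrative.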
Dec 09 14:58:41 crc kubenswrapper[4735]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 09 14:58:41 crc kubenswrapper[4735]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.261652 4735 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266897 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266920 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266925 4735 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266929 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266935 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266940 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266944 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266949 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266954 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266959 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266964 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266968 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266971 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266975 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266980 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266985 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.266999 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267003 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267006 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267010 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267014 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267018 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267021 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267025 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267029 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267032 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267036 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267039 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267044 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267048 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267052 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267057 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267061 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267074 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267078 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267082 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267085 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267089 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267092 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267098 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267101 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267105 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267109 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267113 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267116 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267120 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267123 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267126 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267129 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267132 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267136 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267139 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267143 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267147 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267151 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267155 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267160 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267165 4735 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267168 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267172 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267176 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267180 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267184 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267187 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267189 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267193 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267196 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267200 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267203 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267208 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.267212 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267921 4735 flags.go:64] FLAG: --address="0.0.0.0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267936 4735 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267948 4735 flags.go:64] FLAG: --anonymous-auth="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267953 4735 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267958 4735 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267962 4735 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267967 4735 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267972 4735 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267976 4735 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267980 4735 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267985 4735 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267991 4735 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.267997 4735 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268002 4735 flags.go:64] FLAG: --cgroup-root="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268006 4735 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268011 4735 flags.go:64] FLAG: --client-ca-file="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268017 4735 flags.go:64] FLAG: --cloud-config="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268021 4735 flags.go:64] FLAG: --cloud-provider="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268025 4735 flags.go:64] FLAG: --cluster-dns="[]" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268031 4735 flags.go:64] FLAG: --cluster-domain="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268036 4735 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268040 4735 flags.go:64] FLAG: --config-dir="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268044 4735 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268048 4735 flags.go:64] FLAG: --container-log-max-files="5" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268054 4735 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268059 4735 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268063 4735 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268070 4735 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 09 14:58:41 crc kubenswrapper[4735]: 
I1209 14:58:41.268074 4735 flags.go:64] FLAG: --contention-profiling="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268078 4735 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268082 4735 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268087 4735 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268091 4735 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268104 4735 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268108 4735 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268112 4735 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268116 4735 flags.go:64] FLAG: --enable-load-reader="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268120 4735 flags.go:64] FLAG: --enable-server="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268124 4735 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268131 4735 flags.go:64] FLAG: --event-burst="100" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268136 4735 flags.go:64] FLAG: --event-qps="50" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268140 4735 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268145 4735 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268148 4735 flags.go:64] FLAG: --eviction-hard="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268153 4735 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268157 4735 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268162 4735 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268166 4735 flags.go:64] FLAG: --eviction-soft="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268170 4735 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268174 4735 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268178 4735 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268182 4735 flags.go:64] FLAG: --experimental-mounter-path="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268186 4735 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268190 4735 flags.go:64] FLAG: --fail-swap-on="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268194 4735 flags.go:64] FLAG: --feature-gates="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268198 4735 flags.go:64] FLAG: --file-check-frequency="20s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268202 4735 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268206 4735 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 
14:58:41.268211 4735 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268218 4735 flags.go:64] FLAG: --healthz-port="10248" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268222 4735 flags.go:64] FLAG: --help="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268226 4735 flags.go:64] FLAG: --hostname-override="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268230 4735 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268234 4735 flags.go:64] FLAG: --http-check-frequency="20s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268238 4735 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268243 4735 flags.go:64] FLAG: --image-credential-provider-config="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268246 4735 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268250 4735 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268254 4735 flags.go:64] FLAG: --image-service-endpoint="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268258 4735 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268262 4735 flags.go:64] FLAG: --kube-api-burst="100" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268266 4735 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268270 4735 flags.go:64] FLAG: --kube-api-qps="50" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268274 4735 flags.go:64] FLAG: --kube-reserved="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268278 4735 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268281 4735 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268285 4735 flags.go:64] FLAG: --kubelet-cgroups="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268289 4735 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268293 4735 flags.go:64] FLAG: --lock-file="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268296 4735 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268300 4735 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268304 4735 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268310 4735 flags.go:64] FLAG: --log-json-split-stream="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268314 4735 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268318 4735 flags.go:64] FLAG: --log-text-split-stream="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268322 4735 flags.go:64] FLAG: --logging-format="text" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268325 4735 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268329 4735 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 
14:58:41.268333 4735 flags.go:64] FLAG: --manifest-url="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268337 4735 flags.go:64] FLAG: --manifest-url-header="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268341 4735 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268348 4735 flags.go:64] FLAG: --max-open-files="1000000" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268352 4735 flags.go:64] FLAG: --max-pods="110" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268356 4735 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268360 4735 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268364 4735 flags.go:64] FLAG: --memory-manager-policy="None" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268368 4735 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268373 4735 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268378 4735 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268382 4735 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268393 4735 flags.go:64] FLAG: --node-status-max-images="50" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268398 4735 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268402 4735 flags.go:64] FLAG: --oom-score-adj="-999" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268407 4735 flags.go:64] FLAG: --pod-cidr="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268412 4735 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268421 4735 flags.go:64] FLAG: --pod-manifest-path="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268425 4735 flags.go:64] FLAG: --pod-max-pids="-1" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268429 4735 flags.go:64] FLAG: --pods-per-core="0" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268433 4735 flags.go:64] FLAG: --port="10250" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268437 4735 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268441 4735 flags.go:64] FLAG: --provider-id="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268445 4735 flags.go:64] FLAG: --qos-reserved="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268448 4735 flags.go:64] FLAG: --read-only-port="10255" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268452 4735 flags.go:64] FLAG: --register-node="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268456 4735 flags.go:64] FLAG: --register-schedulable="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268461 4735 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268467 4735 flags.go:64] FLAG: --registry-burst="10" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268471 
4735 flags.go:64] FLAG: --registry-qps="5" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268475 4735 flags.go:64] FLAG: --reserved-cpus="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268481 4735 flags.go:64] FLAG: --reserved-memory="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268487 4735 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268491 4735 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268495 4735 flags.go:64] FLAG: --rotate-certificates="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268501 4735 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268506 4735 flags.go:64] FLAG: --runonce="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268527 4735 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268532 4735 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268536 4735 flags.go:64] FLAG: --seccomp-default="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268540 4735 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268544 4735 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268549 4735 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268553 4735 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268557 4735 flags.go:64] FLAG: --storage-driver-password="root" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268563 4735 flags.go:64] FLAG: --storage-driver-secure="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268567 4735 flags.go:64] FLAG: --storage-driver-table="stats" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268572 4735 flags.go:64] FLAG: --storage-driver-user="root" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268576 4735 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268586 4735 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268591 4735 flags.go:64] FLAG: --system-cgroups="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268594 4735 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268600 4735 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268604 4735 flags.go:64] FLAG: --tls-cert-file="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268609 4735 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268613 4735 flags.go:64] FLAG: --tls-min-version="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268617 4735 flags.go:64] FLAG: --tls-private-key-file="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268621 4735 flags.go:64] FLAG: --topology-manager-policy="none" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268625 4735 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268629 4735 flags.go:64] 
FLAG: --topology-manager-scope="container" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268633 4735 flags.go:64] FLAG: --v="2" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268638 4735 flags.go:64] FLAG: --version="false" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268644 4735 flags.go:64] FLAG: --vmodule="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268648 4735 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.268652 4735 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268747 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268752 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268757 4735 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268760 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268764 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268779 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268783 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268786 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268789 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268793 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268796 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268799 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268803 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268806 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268810 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268813 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268818 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268822 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268825 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268828 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268831 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268835 4735 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268838 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268841 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268846 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268850 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268854 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268858 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268862 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268866 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268870 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268873 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268877 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268881 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268885 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268889 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268893 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268898 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268903 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268907 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268911 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268914 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268918 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268922 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268926 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268929 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268933 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268936 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268941 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268945 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268949 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268953 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268956 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268961 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268965 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268969 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268973 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268976 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268980 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268984 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268989 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268993 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.268998 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269002 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269005 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269009 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269014 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269017 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269022 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269026 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.269030 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.269701 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.277050 4735 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.277076 4735 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277127 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277134 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277139 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277144 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277148 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277153 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277158 4735 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277162 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277165 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277169 
4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277173 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277177 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277180 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277183 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277188 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277193 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277198 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277202 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277207 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277211 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277215 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277218 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277222 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277226 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277230 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277234 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277238 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277241 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277245 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277249 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277254 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277259 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277264 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277270 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277276 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277282 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277287 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277293 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277299 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277304 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277308 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277311 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277316 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277320 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277323 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277327 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277331 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277335 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277338 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277341 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277345 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277348 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277352 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277356 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277359 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277364 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277367 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277371 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277375 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277378 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277382 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277385 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277388 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277391 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277395 4735 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277398 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277402 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277405 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277408 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277411 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277415 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.277420 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277554 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277562 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277566 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277570 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277573 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277576 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277579 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277582 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277585 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277590 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277593 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277596 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277599 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277602 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277606 4735 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277609 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277613 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277616 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277620 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277623 4735 feature_gate.go:330] unrecognized feature gate: Example Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277627 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277630 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277633 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277637 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277641 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277646 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277650 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277654 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277657 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277662 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277666 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277670 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277674 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277679 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277684 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277688 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277692 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277695 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277699 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277702 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277705 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277708 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277712 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277715 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277719 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277724 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277727 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277731 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277735 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277739 4735 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277742 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277746 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277750 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277753 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277757 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277762 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277771 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277775 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277778 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277781 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277784 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277788 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277791 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277794 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277797 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277800 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277803 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277807 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277810 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277813 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.277817 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.277821 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.277925 4735 server.go:940] "Client rotation is on, will bootstrap in background" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.280555 4735 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.280635 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.281579 4735 server.go:997] "Starting client certificate rotation" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.281606 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.281793 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-30 12:53:47.157532013 +0000 UTC Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.281924 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 501h55m5.875609811s for next certificate rotation Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.294172 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.296177 4735 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.310125 4735 log.go:25] "Validated CRI v1 runtime API" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.329924 4735 log.go:25] "Validated CRI v1 image API" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.332808 4735 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.336646 4735 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-09-14-55-08-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.336682 4735 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.352605 4735 manager.go:217] Machine: {Timestamp:2025-12-09 14:58:41.350862829 +0000 UTC m=+0.275701457 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445404 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea BootID:2ea56a57-18d8-4f3a-8391-c192c3891ec8 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 
HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108169 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d9:28:6e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:d9:28:6e Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:b8:68:93 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:42:c9:bf Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:4d:d1:36 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:e8:84:c9 Speed:-1 Mtu:1436} {Name:enp7s0.23 MacAddress:52:54:00:7d:2c:da Speed:-1 Mtu:1436} {Name:eth10 MacAddress:ca:ad:01:97:46:38 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:32:96:a5:9a:8c:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 
Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.352862 4735 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.352969 4735 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.353245 4735 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.353454 4735 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.353484 4735 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.353708 4735 topology_manager.go:138] "Creating topology manager with none policy" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.353718 4735 container_manager_linux.go:303] "Creating device plugin manager" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.353999 4735 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.354028 4735 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.354496 4735 state_mem.go:36] "Initialized new in-memory state store" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.354852 4735 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.356877 4735 kubelet.go:418] "Attempting to sync node with API server" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.356898 4735 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.356920 4735 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.356932 4735 kubelet.go:324] "Adding apiserver pod source" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.356947 4735 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.359155 4735 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.360011 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.361417 4735 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.361554 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.361669 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.361551 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.361772 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362453 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362482 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362490 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362500 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362529 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362537 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362544 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362555 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362563 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362572 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362599 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.362608 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.363187 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 
14:58:41.363699 4735 server.go:1280] "Started kubelet" Dec 09 14:58:41 crc systemd[1]: Started Kubernetes Kubelet. Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.365025 4735 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.365007 4735 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.368996 4735 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.369618 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.371402 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.25.226:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f94036b9a9e98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 14:58:41.363631768 +0000 UTC m=+0.288470396,LastTimestamp:2025-12-09 14:58:41.363631768 +0000 UTC m=+0.288470396,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.372964 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373001 4735 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373368 4735 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373402 4735 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373352 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 20:13:30.165821316 +0000 UTC Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373431 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 605h14m48.792392684s for next certificate rotation Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.373614 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373623 4735 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.373654 4735 server.go:460] "Adding debug handlers to kubelet server" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.374335 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:41 crc kubenswrapper[4735]: 
E1209 14:58:41.374408 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.374112 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="200ms" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.374893 4735 factory.go:55] Registering systemd factory Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.374918 4735 factory.go:221] Registration of the systemd container factory successfully Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.375216 4735 factory.go:153] Registering CRI-O factory Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.375301 4735 factory.go:221] Registration of the crio container factory successfully Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.375800 4735 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.375875 4735 factory.go:103] Registering Raw factory Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.375934 4735 manager.go:1196] Started watching for new ooms in manager Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.378673 4735 manager.go:319] Starting recovery of all containers Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382772 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382826 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382838 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382848 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382857 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382866 4735 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382874 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382883 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382894 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382903 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382911 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382921 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382942 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382959 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382984 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.382993 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383003 4735 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383012 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383021 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383028 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383037 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383044 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383056 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383064 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383074 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383083 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383093 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383103 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383111 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383119 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383134 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383159 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383173 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383182 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383189 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383198 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383206 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383215 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383223 4735 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383231 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383244 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383252 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383261 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383269 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383277 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383286 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383295 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383303 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383313 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383322 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383330 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383339 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383362 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383372 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.383383 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384146 4735 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384177 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384194 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384205 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384258 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384270 
4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384280 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384289 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384306 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384323 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384341 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384364 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384374 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384385 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384394 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384408 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384426 4735 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384434 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384442 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384451 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384466 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384480 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384490 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384499 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384525 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384546 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384557 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384566 4735 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384576 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384587 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384602 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384621 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384634 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384648 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384660 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384670 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384687 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384702 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384724 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384734 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384745 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384759 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384769 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384780 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384795 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384810 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384829 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384848 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384861 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384870 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384894 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384909 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384919 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384930 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384940 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384951 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384966 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384984 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.384994 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385003 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385013 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385024 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385044 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385052 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385066 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385078 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385087 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385102 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385112 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385127 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385136 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385144 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385153 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385164 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385177 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385188 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385196 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385208 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385220 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385228 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385241 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385249 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385258 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385266 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385274 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385289 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385306 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385323 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385337 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385348 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385356 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385364 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385381 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385390 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385398 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385412 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385420 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385672 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385687 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385705 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385723 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.385737 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387545 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387627 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387644 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387670 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387682 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387700 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387718 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387729 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387744 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387757 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387769 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387784 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387795 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387810 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387820 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387834 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387849 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387879 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387894 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387908 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387923 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387943 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387955 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387975 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.387991 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388005 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388025 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388040 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388057 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388069 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388082 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388099 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388111 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388128 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388139 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388149 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388164 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388176 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388190 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388206 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388218 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388234 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388243 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388259 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388273 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388290 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388306 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388319 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388333 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388350 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388361 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388372 4735 reconstruct.go:97] "Volume reconstruction finished" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.388378 4735 reconciler.go:26] "Reconciler: start to sync state" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.399568 4735 manager.go:324] Recovery completed Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.407902 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.409057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.409086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.409096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.409650 4735 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.409664 4735 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.409688 4735 state_mem.go:36] "Initialized new in-memory state store" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.411157 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.412764 4735 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.412799 4735 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.412827 4735 kubelet.go:2335] "Starting kubelet main sync loop" Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.412862 4735 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.413367 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.413406 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.414341 4735 policy_none.go:49] "None policy: Start" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.415365 4735 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.415396 4735 state_mem.go:35] "Initializing new in-memory state store" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.461372 4735 manager.go:334] "Starting Device Plugin manager" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.461411 4735 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.461442 4735 server.go:79] "Starting device plugin registration server" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.461771 4735 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.461795 4735 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.462049 4735 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.462169 4735 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.462182 4735 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.470222 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.513793 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.513892 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: 
I1209 14:58:41.514883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.514919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.514932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.515063 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.515371 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.515426 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.515842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.515886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.515899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.516160 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.516231 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.516283 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.516572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.516622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.516637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517640 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517796 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.517834 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.518361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.518396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.518411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.518663 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.518825 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.518867 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519674 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.519702 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.520376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.520413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.520441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.521197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.521227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.521240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.561924 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.562762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.562791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.562802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.562826 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 
14:58:41.563214 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.226:6443: connect: connection refused" node="crc" Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.574859 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="400ms" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590556 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590622 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590661 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590708 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590850 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590880 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.590984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.591013 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.591069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.591100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.591138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692114 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692154 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc 
kubenswrapper[4735]: I1209 14:58:41.692174 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692192 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692244 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692260 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692310 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692411 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692439 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692502 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692476 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692478 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" 
(UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692503 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692452 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692455 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.692712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.764355 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.765408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.765437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:41 
crc kubenswrapper[4735]: I1209 14:58:41.765448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.765472 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.765879 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.226:6443: connect: connection refused" node="crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.840000 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.858125 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.862542 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-d73e5af88f62b62b0b7d9d0679692dc3b3a53b8f9332550675b0c7b77d987234 WatchSource:0}: Error finding container d73e5af88f62b62b0b7d9d0679692dc3b3a53b8f9332550675b0c7b77d987234: Status 404 returned error can't find the container with id d73e5af88f62b62b0b7d9d0679692dc3b3a53b8f9332550675b0c7b77d987234 Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.865100 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.876621 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-728cd348c564af0c1ecb5a9e642b6ff69732b6d9a9cdffed3a0e9961ceaace78 WatchSource:0}: Error finding container 728cd348c564af0c1ecb5a9e642b6ff69732b6d9a9cdffed3a0e9961ceaace78: Status 404 returned error can't find the container with id 728cd348c564af0c1ecb5a9e642b6ff69732b6d9a9cdffed3a0e9961ceaace78 Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.877886 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-059df7a26029fe5f0551f172d1dccc126f3a271d1e82d36a71ba45b8cca8e0af WatchSource:0}: Error finding container 059df7a26029fe5f0551f172d1dccc126f3a271d1e82d36a71ba45b8cca8e0af: Status 404 returned error can't find the container with id 059df7a26029fe5f0551f172d1dccc126f3a271d1e82d36a71ba45b8cca8e0af Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.878322 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: I1209 14:58:41.885001 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.891433 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8990d760b0f7626d0b807ee3658097519a82627285c0f13d021d421cd188f077 WatchSource:0}: Error finding container 8990d760b0f7626d0b807ee3658097519a82627285c0f13d021d421cd188f077: Status 404 returned error can't find the container with id 8990d760b0f7626d0b807ee3658097519a82627285c0f13d021d421cd188f077 Dec 09 14:58:41 crc kubenswrapper[4735]: W1209 14:58:41.906863 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8aee77c52f7f632ef27fda4b5dd9742e1897942fdfe3e5a60ac409cff8cb2239 WatchSource:0}: Error finding container 8aee77c52f7f632ef27fda4b5dd9742e1897942fdfe3e5a60ac409cff8cb2239: Status 404 returned error can't find the container with id 8aee77c52f7f632ef27fda4b5dd9742e1897942fdfe3e5a60ac409cff8cb2239 Dec 09 14:58:41 crc kubenswrapper[4735]: E1209 14:58:41.976424 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="800ms" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.166940 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.168078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.168129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.168161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.168208 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.168645 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.226:6443: connect: connection refused" node="crc" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.370708 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.417747 4735 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426" exitCode=0 Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.417802 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.418018 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8aee77c52f7f632ef27fda4b5dd9742e1897942fdfe3e5a60ac409cff8cb2239"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.418141 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.419386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.419422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.419434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.419955 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.420007 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8990d760b0f7626d0b807ee3658097519a82627285c0f13d021d421cd188f077"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.421836 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218" exitCode=0 Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.421902 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.421944 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"059df7a26029fe5f0551f172d1dccc126f3a271d1e82d36a71ba45b8cca8e0af"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.422037 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.422820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.422849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.422862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.423533 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6" exitCode=0 Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.423605 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.423639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"728cd348c564af0c1ecb5a9e642b6ff69732b6d9a9cdffed3a0e9961ceaace78"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.423807 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.424695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.424723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.424736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.425947 4735 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ef2e1827a5e2a50274e6d913621a59a232afe1b501a4625032924323ef5be7fc" exitCode=0 Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.425978 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ef2e1827a5e2a50274e6d913621a59a232afe1b501a4625032924323ef5be7fc"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.426022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d73e5af88f62b62b0b7d9d0679692dc3b3a53b8f9332550675b0c7b77d987234"} Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.426078 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.426091 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.427000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.427024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.427038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.427303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.427331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.427343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: W1209 14:58:42.529703 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: 
Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.529787 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:42 crc kubenswrapper[4735]: W1209 14:58:42.560447 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.560533 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:42 crc kubenswrapper[4735]: W1209 14:58:42.671124 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.671216 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.777752 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="1.6s" Dec 09 14:58:42 crc kubenswrapper[4735]: W1209 14:58:42.939970 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.940058 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.25.226:6443: connect: connection refused" logger="UnhandledError" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.969005 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.972394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.972435 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.972443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:42 crc kubenswrapper[4735]: I1209 14:58:42.972467 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:42 crc kubenswrapper[4735]: E1209 14:58:42.972822 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.25.226:6443: connect: connection refused" node="crc" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.435235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.435284 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.435294 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.435608 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.436242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.436266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.436274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.437775 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.437824 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.437835 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.437791 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.438401 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.438421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.438429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.440352 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.440388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.440401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.440409 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.440417 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.440474 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.441072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.441095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.441104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.442578 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de" exitCode=0 Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.442617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.442678 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.443237 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.443257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.443266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.444859 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1232718eb0091e7bbefa0b835e1c261d53f50fd142e4b6b66354e7564ad61ec3"} Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.444909 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.445386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.445405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:43 crc kubenswrapper[4735]: I1209 14:58:43.445415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.448950 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20" exitCode=0 Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.449057 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.449041 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20"} Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.449772 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.450113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.450149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.450160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.450881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.450900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.450912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.573334 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.574134 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.574169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.574182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.574204 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:58:44 crc kubenswrapper[4735]: I1209 14:58:44.833245 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.017859 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.276859 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.276983 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.277018 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.277904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.277933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.277942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e"} Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455127 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455133 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77"} Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455146 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019"} Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455156 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494"} Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455164 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c"} Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455314 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.455836 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.456610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.456631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:45 crc kubenswrapper[4735]: I1209 14:58:45.456639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.072029 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.072133 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.073056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.073089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.073100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.232105 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.457814 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.457824 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.457925 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.459401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:46 crc kubenswrapper[4735]: I1209 14:58:46.706455 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.184341 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.461248 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.461283 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.462141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.462170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.462182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.462752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.462794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:47 crc kubenswrapper[4735]: I1209 14:58:47.462809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.014715 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.014887 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.016114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.016155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.016164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.707174 4735 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 09 14:58:49 crc kubenswrapper[4735]: I1209 14:58:49.707286 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 09 14:58:50 crc kubenswrapper[4735]: I1209 14:58:50.698837 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:50 crc kubenswrapper[4735]: I1209 14:58:50.699107 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:50 crc kubenswrapper[4735]: I1209 14:58:50.700744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:50 crc kubenswrapper[4735]: I1209 14:58:50.700790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:50 crc kubenswrapper[4735]: I1209 14:58:50.700800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:50 crc kubenswrapper[4735]: I1209 14:58:50.702889 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:51 crc kubenswrapper[4735]: I1209 14:58:51.468477 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:51 crc kubenswrapper[4735]: I1209 14:58:51.469316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:51 crc kubenswrapper[4735]: I1209 14:58:51.469356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:51 crc kubenswrapper[4735]: I1209 14:58:51.469368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:51 crc kubenswrapper[4735]: E1209 14:58:51.471347 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 09 14:58:52 crc kubenswrapper[4735]: I1209 14:58:52.316363 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:58:52 crc kubenswrapper[4735]: I1209 14:58:52.316682 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:52 crc kubenswrapper[4735]: I1209 14:58:52.318241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:52 crc kubenswrapper[4735]: I1209 14:58:52.318285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:52 crc kubenswrapper[4735]: I1209 14:58:52.318293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 09 14:58:53 crc kubenswrapper[4735]: I1209 14:58:53.370934 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 09 14:58:53 crc kubenswrapper[4735]: I1209 14:58:53.569730 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 14:58:53 crc kubenswrapper[4735]: I1209 14:58:53.569830 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 14:58:53 crc kubenswrapper[4735]: I1209 14:58:53.578387 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 09 14:58:53 crc kubenswrapper[4735]: I1209 14:58:53.578473 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 09 14:58:55 crc kubenswrapper[4735]: I1209 14:58:55.022280 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:55 crc kubenswrapper[4735]: I1209 14:58:55.022407 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:55 crc kubenswrapper[4735]: I1209 14:58:55.023446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:55 crc kubenswrapper[4735]: I1209 14:58:55.023491 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:55 crc kubenswrapper[4735]: I1209 14:58:55.023501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.237103 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.237511 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.240281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.240345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.240355 4735 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.241019 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.478081 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.478134 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.479105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.479142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:58:56 crc kubenswrapper[4735]: I1209 14:58:56.479151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:58:58 crc kubenswrapper[4735]: E1209 14:58:58.565241 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.569780 4735 trace.go:236] Trace[1216191377]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:45.752) (total time: 12817ms): Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[1216191377]: ---"Objects listed" error: 12817ms (14:58:58.569) Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[1216191377]: [12.817451213s] [12.817451213s] END Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.569803 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570612 4735 trace.go:236] Trace[37020932]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:45.225) (total time: 13344ms): Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[37020932]: ---"Objects listed" error: 13344ms (14:58:58.570) Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[37020932]: [13.344724192s] [13.344724192s] END Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570628 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570743 4735 trace.go:236] Trace[1344926757]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:44.534) (total time: 14036ms): Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[1344926757]: ---"Objects listed" error: 14036ms (14:58:58.570) Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[1344926757]: [14.036591803s] [14.036591803s] END Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570770 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570922 4735 trace.go:236] Trace[263544068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Dec-2025 14:58:45.796) (total time: 12774ms): Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[263544068]: ---"Objects listed" error: 12774ms (14:58:58.570) Dec 09 14:58:58 crc kubenswrapper[4735]: Trace[263544068]: 
[12.774607636s] [12.774607636s] END Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570940 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.570975 4735 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 09 14:58:58 crc kubenswrapper[4735]: E1209 14:58:58.571480 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.854782 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37854->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.854838 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37862->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.854885 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37854->192.168.126.11:17697: read: connection reset by peer" Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.854964 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37862->192.168.126.11:17697: read: connection reset by peer" Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.855443 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 09 14:58:58 crc kubenswrapper[4735]: I1209 14:58:58.855504 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.034335 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.042494 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.369655 4735 apiserver.go:52] "Watching apiserver" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.374656 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 
09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.375293 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7qhfd","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-machine-config-operator/machine-config-daemon-t5lmh","openshift-multus/multus-additional-cni-plugins-qvmkc","openshift-multus/multus-xnf8f","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-qblcd"] Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.375721 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.375906 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.376010 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.376144 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.376148 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.376262 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.376206 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.376346 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.376373 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.377537 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.378592 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.378838 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.378919 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.379496 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381154 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381329 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381481 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381684 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381722 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381734 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.381922 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382024 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382121 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382327 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382586 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382606 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382758 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.382841 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.383013 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.383055 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.383265 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.385921 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.387493 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.387952 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388248 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388308 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388318 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388245 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388472 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388495 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388563 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388627 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388672 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388680 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.388819 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.401059 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.408333 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.414992 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.425372 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.432850 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.442593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.450434 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.460440 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.462335 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.464065 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.470085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.475170 4735 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.479990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480029 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480099 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480122 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480142 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.479556 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480168 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480437 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480459 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480481 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480502 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480543 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480569 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 
14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480588 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480624 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480642 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480663 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480678 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480696 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480712 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480730 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480744 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480792 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480808 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480827 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480843 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480860 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480874 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480890 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480929 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480958 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480978 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480996 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481010 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481024 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481070 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481205 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481228 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481245 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481261 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481281 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481299 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481334 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481352 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481384 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481436 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481457 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481473 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481488 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481505 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481546 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481563 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481578 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481594 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481610 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481646 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481672 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481689 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481706 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481721 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481736 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481752 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481847 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481941 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481964 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482035 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 
14:58:59.482074 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482091 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482120 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482140 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480719 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482158 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482182 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482205 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482245 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482264 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482300 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482319 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482342 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482400 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482437 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482457 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482509 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482542 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482563 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482585 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482601 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482621 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482642 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482662 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482681 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482726 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482771 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482794 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") 
" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482833 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482853 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482872 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482889 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482929 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482966 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482986 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483006 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483027 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483050 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483077 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483096 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483116 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483135 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483154 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483175 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483196 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483226 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483248 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483268 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483287 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483308 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483825 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483855 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483873 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483916 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483935 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483974 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483995 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484015 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484036 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484058 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484078 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484098 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 
14:58:59.484119 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484138 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484177 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484215 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484237 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484257 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484275 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484297 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 09 
14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484318 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484337 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484359 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484398 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484418 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484437 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484460 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484477 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484502 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484538 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.480928 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.485479 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481113 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481333 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481377 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.481625 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482142 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.482703 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.483195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484565 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484916 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.484993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.485186 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.485197 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.485225 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.485394 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487006 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487363 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487588 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487600 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487757 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487956 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.487910 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488117 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488223 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488193 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488414 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488426 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488587 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488741 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488900 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.488911 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.489060 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.489133 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.489309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.489453 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.489619 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490076 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490167 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490348 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490444 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490528 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.490640 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:58:59.990618671 +0000 UTC m=+18.915457298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490874 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.490896 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.492736 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.492793 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.492839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.492845 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.492970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.492994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.493093 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.493451 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.493546 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.493742 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.493987 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494220 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494328 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494358 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494614 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494834 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494839 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.495350 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.495478 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.495731 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496104 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496332 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496363 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496400 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496629 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.496713 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.497746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.498127 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.498147 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.498185 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.498613 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499008 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499169 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499190 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499445 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499552 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.499817 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.494345 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500080 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500157 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500411 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500711 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500824 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500906 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.500925 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.501139 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.501340 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.501554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.501601 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.501846 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.502423 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.502432 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.502585 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.502953 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.503014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.503236 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.504258 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.504659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.504911 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.504922 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.505100 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32f
a41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"lo
g-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.505253 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.505746 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.506130 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.506446 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.506684 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.506872 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.506879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.506992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507047 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507134 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507216 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507277 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507340 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507463 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507542 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507614 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507662 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.507989 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508242 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508322 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508496 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508522 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509070 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509091 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-socket-dir-parent\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509193 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-netns\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509211 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgvn\" (UniqueName: \"kubernetes.io/projected/9617623e-09bb-4eb1-9b58-025df7afa461-kube-api-access-tdgvn\") pod \"node-resolver-7qhfd\" (UID: \"9617623e-09bb-4eb1-9b58-025df7afa461\") " pod="openshift-dns/node-resolver-7qhfd" Dec 09 
14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510808 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510843 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-kubelet\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-log-socket\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-bin\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510936 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-config\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510954 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-script-lib\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510970 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6dw\" (UniqueName: \"kubernetes.io/projected/9374566a-4662-4e98-ae18-6f52468332b5-kube-api-access-fn6dw\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-etc-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511029 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511047 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511086 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-env-overrides\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511124 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-k8s-cni-cncf-io\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511205 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/67d17a09-b547-49cf-8195-5af12413f51c-multus-daemon-config\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511227 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-etc-kubernetes\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511244 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-slash\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511317 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-var-lib-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511358 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511378 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-mcd-auth-proxy-config\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511421 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmq2\" (UniqueName: \"kubernetes.io/projected/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-kube-api-access-4wmq2\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511444 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-conf-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511468 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511545 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-rootfs\") pod \"machine-config-daemon-t5lmh\" (UID: 
\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511584 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-cnibin\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511598 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-os-release\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511615 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511650 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511667 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-system-cni-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511700 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67d17a09-b547-49cf-8195-5af12413f51c-cni-binary-copy\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-node-log\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511748 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-cni-bin\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-hostroot\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511791 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-systemd\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-netd\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-cni-multus\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-netns\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511855 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9374566a-4662-4e98-ae18-6f52468332b5-ovn-node-metrics-cert\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511869 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9617623e-09bb-4eb1-9b58-025df7afa461-hosts-file\") pod \"node-resolver-7qhfd\" (UID: \"9617623e-09bb-4eb1-9b58-025df7afa461\") " pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-os-release\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511905 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-multus-certs\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ddf2068-c88d-46fd-97ac-eba38d91c642-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-kubelet\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511981 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ddf2068-c88d-46fd-97ac-eba38d91c642-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-cni-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512035 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-ovn\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc 
kubenswrapper[4735]: I1209 14:58:59.512051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-system-cni-dir\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-cnibin\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512113 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-systemd-units\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512127 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-proxy-tls\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwb7\" (UniqueName: \"kubernetes.io/projected/67d17a09-b547-49cf-8195-5af12413f51c-kube-api-access-gxwb7\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512157 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpb6h\" (UniqueName: \"kubernetes.io/projected/5ddf2068-c88d-46fd-97ac-eba38d91c642-kube-api-access-fpb6h\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512243 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512263 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512273 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512283 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512294 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512303 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512313 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512322 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512332 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512342 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512352 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512362 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512372 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512382 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512391 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512401 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512411 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512420 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512429 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512438 4735 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512447 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512455 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512465 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512474 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512483 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512507 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512532 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512542 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512552 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512561 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512571 4735 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512580 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512588 4735 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512597 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512606 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512614 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512622 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512631 4735 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512641 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512650 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512661 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512670 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512680 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512689 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512700 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512709 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512719 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512727 4735 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513129 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513149 4735 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513159 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513169 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513178 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 
09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513186 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513195 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513204 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513214 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513249 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513260 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513268 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513277 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513306 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513315 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513324 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513500 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513526 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513536 
4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513987 4735 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514005 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514017 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514054 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514066 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514172 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514184 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514194 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514233 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514243 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514252 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514261 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc 
kubenswrapper[4735]: I1209 14:58:59.514270 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514295 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514307 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514402 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514464 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514479 4735 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514489 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514497 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514491 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a" exitCode=255 Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514506 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514538 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514550 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514559 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514570 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514579 4735 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514588 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514597 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514608 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514617 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514629 4735 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514638 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514648 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514657 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514666 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514702 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514716 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515033 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515099 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a"} Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.508952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517623 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515821 4735 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517745 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517729 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515564 4735 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517951 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.518166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.518203 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509885 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509984 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510358 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510633 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510928 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.510983 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511903 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.511988 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512652 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.512888 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513635 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.513616 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514029 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514303 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.514506 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.520979 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514493 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514561 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514644 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514809 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.514835 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515177 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515421 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515500 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515967 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.516213 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.515888 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.516978 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517020 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517008 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517184 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.518241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.518399 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.517582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.518732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.519049 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.519064 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.519258 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.519267 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.519299 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.509808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.519378 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.523590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.527593 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.527863 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.527880 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.527877 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.527891 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.528098 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.528611 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.529096 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.529963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.530288 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.530844 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.534203 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.534230 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.534243 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.535249 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.535709 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.535840 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.535847 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.535879 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.535917 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.536150 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.536548 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.537126 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.537145 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.537283 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.537773 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.537799 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538635 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538674 4735 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538694 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538709 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538721 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538738 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538753 4735 reconciler_common.go:293] "Volume 
detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538776 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538790 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538802 4735 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538816 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.538851 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.539067 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:00.039043884 +0000 UTC m=+18.963882512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.539134 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:00.039109999 +0000 UTC m=+18.963948617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.539238 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:00.039206481 +0000 UTC m=+18.964045109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4735]: E1209 14:58:59.539268 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:00.039260672 +0000 UTC m=+18.964099300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539363 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539391 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539407 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539417 4735 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539426 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539435 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539444 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539453 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 
14:58:59.539462 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539470 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539479 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.539489 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.540023 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.547430 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.553745 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.554663 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.554791 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.559464 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.561753 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.565036 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.570170 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.579779 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.583006 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.583222 4735 scope.go:117] "RemoveContainer" containerID="9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.596371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.604462 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.620470 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.629678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.639735 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640074 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-env-overrides\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-k8s-cni-cncf-io\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/67d17a09-b547-49cf-8195-5af12413f51c-multus-daemon-config\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-etc-kubernetes\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640170 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-slash\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640191 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-var-lib-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-mcd-auth-proxy-config\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wmq2\" (UniqueName: \"kubernetes.io/projected/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-kube-api-access-4wmq2\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640260 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-conf-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640277 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-rootfs\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640290 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-cnibin\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-os-release\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-system-cni-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" 
Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67d17a09-b547-49cf-8195-5af12413f51c-cni-binary-copy\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640371 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-node-log\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640405 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-cni-bin\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640418 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-hostroot\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-systemd\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-netd\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-cni-multus\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640480 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-netns\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640502 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9374566a-4662-4e98-ae18-6f52468332b5-ovn-node-metrics-cert\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640535 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9617623e-09bb-4eb1-9b58-025df7afa461-hosts-file\") pod \"node-resolver-7qhfd\" (UID: \"9617623e-09bb-4eb1-9b58-025df7afa461\") " pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-os-release\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640570 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-multus-certs\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640585 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ddf2068-c88d-46fd-97ac-eba38d91c642-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640602 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-kubelet\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ddf2068-c88d-46fd-97ac-eba38d91c642-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-cni-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.640651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-ovn\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-system-cni-dir\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644444 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-cnibin\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-systemd-units\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644482 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-proxy-tls\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644500 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwb7\" (UniqueName: \"kubernetes.io/projected/67d17a09-b547-49cf-8195-5af12413f51c-kube-api-access-gxwb7\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpb6h\" (UniqueName: \"kubernetes.io/projected/5ddf2068-c88d-46fd-97ac-eba38d91c642-kube-api-access-fpb6h\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ddf2068-c88d-46fd-97ac-eba38d91c642-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642576 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-k8s-cni-cncf-io\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642948 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/67d17a09-b547-49cf-8195-5af12413f51c-multus-daemon-config\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-etc-kubernetes\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642991 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-slash\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644623 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-socket-dir-parent\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/67d17a09-b547-49cf-8195-5af12413f51c-cni-binary-copy\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-mcd-auth-proxy-config\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643026 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-ovn\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-rootfs\") pod \"machine-config-daemon-t5lmh\" (UID: 
\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643737 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-cnibin\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-os-release\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643803 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-system-cni-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643920 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-netns\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-kubelet\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644099 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9617623e-09bb-4eb1-9b58-025df7afa461-hosts-file\") pod \"node-resolver-7qhfd\" (UID: \"9617623e-09bb-4eb1-9b58-025df7afa461\") " pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644734 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-systemd-units\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642137 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644571 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-socket-dir-parent\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-system-cni-dir\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-conf-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643904 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-multus-certs\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-netns\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644081 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-cni-multus\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642118 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-node-log\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644813 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgvn\" (UniqueName: \"kubernetes.io/projected/9617623e-09bb-4eb1-9b58-025df7afa461-kube-api-access-tdgvn\") pod \"node-resolver-7qhfd\" (UID: \"9617623e-09bb-4eb1-9b58-025df7afa461\") " pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644843 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-kubelet\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644859 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-log-socket\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644893 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-bin\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644908 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-config\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644922 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-script-lib\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6dw\" (UniqueName: \"kubernetes.io/projected/9374566a-4662-4e98-ae18-6f52468332b5-kube-api-access-fn6dw\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-etc-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645046 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645056 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645065 4735 reconciler_common.go:293] "Volume 
detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645074 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645085 4735 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645095 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645104 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645112 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645120 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645129 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645137 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645146 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645155 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645164 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645175 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645184 4735 reconciler_common.go:293] "Volume detached 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645193 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645202 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645211 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645219 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645228 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645236 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645245 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645253 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645262 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645270 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645278 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645286 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645294 4735 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645302 4735 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645309 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645317 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645325 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645333 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645346 4735 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645354 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645362 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645370 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645379 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645388 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645396 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645405 4735 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645414 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645422 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645430 4735 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645438 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645446 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645455 4735 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645463 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645471 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645481 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645489 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645499 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645507 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645531 4735 reconciler_common.go:293] "Volume detached 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645540 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645548 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645556 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645564 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645572 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645585 4735 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645595 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645602 4735 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645610 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645620 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645629 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645637 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645662 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-netd\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642548 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-env-overrides\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-systemd\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-var-lib-cni-bin\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.645965 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-cnibin\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.643009 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-var-lib-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646000 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-host-run-netns\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644178 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-multus-cni-dir\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646057 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-bin\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.642173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/67d17a09-b547-49cf-8195-5af12413f51c-hostroot\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646102 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-kubelet\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-log-socket\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.644219 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-os-release\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-script-lib\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-etc-openvswitch\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.646891 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ddf2068-c88d-46fd-97ac-eba38d91c642-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.647120 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-config\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.648810 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ddf2068-c88d-46fd-97ac-eba38d91c642-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.649057 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-proxy-tls\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.654851 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9374566a-4662-4e98-ae18-6f52468332b5-ovn-node-metrics-cert\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.666817 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgvn\" (UniqueName: \"kubernetes.io/projected/9617623e-09bb-4eb1-9b58-025df7afa461-kube-api-access-tdgvn\") pod \"node-resolver-7qhfd\" (UID: \"9617623e-09bb-4eb1-9b58-025df7afa461\") " pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.674204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpb6h\" (UniqueName: \"kubernetes.io/projected/5ddf2068-c88d-46fd-97ac-eba38d91c642-kube-api-access-fpb6h\") pod \"multus-additional-cni-plugins-qvmkc\" (UID: \"5ddf2068-c88d-46fd-97ac-eba38d91c642\") " pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.674356 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.674866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxwb7\" (UniqueName: \"kubernetes.io/projected/67d17a09-b547-49cf-8195-5af12413f51c-kube-api-access-gxwb7\") pod \"multus-xnf8f\" (UID: \"67d17a09-b547-49cf-8195-5af12413f51c\") " pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.675958 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmq2\" (UniqueName: \"kubernetes.io/projected/9700326d-c8d3-42a5-8521-b0fab6ca8ffe-kube-api-access-4wmq2\") pod \"machine-config-daemon-t5lmh\" (UID: \"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\") " pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.682614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6dw\" (UniqueName: \"kubernetes.io/projected/9374566a-4662-4e98-ae18-6f52468332b5-kube-api-access-fn6dw\") pod \"ovnkube-node-qblcd\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.693780 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.702268 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.709270 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.714639 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.719842 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7qhfd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.724998 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.731329 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.735674 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:58:59 crc kubenswrapper[4735]: I1209 14:58:59.740008 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xnf8f" Dec 09 14:58:59 crc kubenswrapper[4735]: W1209 14:58:59.761826 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9617623e_09bb_4eb1_9b58_025df7afa461.slice/crio-b41585550badd3af7873936981330c74cf4c3533ed2e7a2c615ed65ef7b04181 WatchSource:0}: Error finding container b41585550badd3af7873936981330c74cf4c3533ed2e7a2c615ed65ef7b04181: Status 404 returned error can't find the container with id b41585550badd3af7873936981330c74cf4c3533ed2e7a2c615ed65ef7b04181 Dec 09 14:58:59 crc kubenswrapper[4735]: W1209 14:58:59.771739 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ddf2068_c88d_46fd_97ac_eba38d91c642.slice/crio-f7d58c71a63ec0a556223e5fe1a0a976bdb0a8b49c263cea0ca7d611a5f16789 WatchSource:0}: Error finding container f7d58c71a63ec0a556223e5fe1a0a976bdb0a8b49c263cea0ca7d611a5f16789: Status 404 returned error can't find the container with id f7d58c71a63ec0a556223e5fe1a0a976bdb0a8b49c263cea0ca7d611a5f16789 Dec 09 14:58:59 crc kubenswrapper[4735]: W1209 14:58:59.777599 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9374566a_4662_4e98_ae18_6f52468332b5.slice/crio-d35f7d254aa9fb6aba258457e3c45e02bf100e9bc3af5cb8d9fb6f718ec6db12 WatchSource:0}: Error finding container d35f7d254aa9fb6aba258457e3c45e02bf100e9bc3af5cb8d9fb6f718ec6db12: Status 404 returned error can't find the container with id d35f7d254aa9fb6aba258457e3c45e02bf100e9bc3af5cb8d9fb6f718ec6db12 Dec 09 14:58:59 crc kubenswrapper[4735]: W1209 14:58:59.779198 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d17a09_b547_49cf_8195_5af12413f51c.slice/crio-e899955cc0ca0a69f6e7e1441ba61776bf18c0157e09eb7e83944047ca9b0cdd WatchSource:0}: Error finding container e899955cc0ca0a69f6e7e1441ba61776bf18c0157e09eb7e83944047ca9b0cdd: Status 404 returned error can't find the container with id 
e899955cc0ca0a69f6e7e1441ba61776bf18c0157e09eb7e83944047ca9b0cdd Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.048701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.048830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.048857 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.048883 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.048901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049011 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049025 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049037 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049084 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:01.049069504 +0000 UTC m=+19.973908133 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049102 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049125 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:59:01.049099873 +0000 UTC m=+19.973938500 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049139 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049165 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049170 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:01.049152721 +0000 UTC m=+19.973991349 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049177 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049175 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049230 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:01.049212374 +0000 UTC m=+19.974051003 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.049269 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:01.049249293 +0000 UTC m=+19.974087922 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.519790 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.521629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.522426 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.523502 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerStarted","Data":"8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.523537 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerStarted","Data":"e899955cc0ca0a69f6e7e1441ba61776bf18c0157e09eb7e83944047ca9b0cdd"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.525145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.525169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.525178 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a65ce6c4a314b3086454a6ad31f8b7306332af259e42e614fce00982e82adf09"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.526609 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ddf2068-c88d-46fd-97ac-eba38d91c642" containerID="705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53" exitCode=0 Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.526679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerDied","Data":"705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.526714 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" 
event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerStarted","Data":"f7d58c71a63ec0a556223e5fe1a0a976bdb0a8b49c263cea0ca7d611a5f16789"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.527871 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7qhfd" event={"ID":"9617623e-09bb-4eb1-9b58-025df7afa461","Type":"ContainerStarted","Data":"6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.527910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7qhfd" event={"ID":"9617623e-09bb-4eb1-9b58-025df7afa461","Type":"ContainerStarted","Data":"b41585550badd3af7873936981330c74cf4c3533ed2e7a2c615ed65ef7b04181"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.529219 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6a5dd6760ca145a24e6f446b85f88af777ded7a7722e3248a6518e6316a975de"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.530858 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.530912 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e6240846ba5a0f09b4d56022a39eb207d01e59eb62a1d712a25b289988ac6046"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.532266 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" exitCode=0 Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.532327 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.532344 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"d35f7d254aa9fb6aba258457e3c45e02bf100e9bc3af5cb8d9fb6f718ec6db12"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.534077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.534100 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.534110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" 
event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"eb2f4552587dca01e9e14a6e228db7e85be2db3ad7f4eff1ed004ab6ff8d235b"} Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.534647 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: E1209 14:59:00.543285 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.548975 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.559829 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.577357 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.590137 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.600364 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.611325 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.621540 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.635119 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.652739 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.669639 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.681577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.693318 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.704721 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.719725 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.729988 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.740607 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.753012 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.764598 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.768988 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x5f7x"] Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.769347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.770415 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.770820 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.770822 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.771084 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.774630 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.782709 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.791850 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.811813 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.828778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.843716 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.854912 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9edd7b0-a112-42be-b351-018a9f9c68e3-serviceca\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.854949 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9edd7b0-a112-42be-b351-018a9f9c68e3-host\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.854971 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnv8t\" (UniqueName: \"kubernetes.io/projected/a9edd7b0-a112-42be-b351-018a9f9c68e3-kube-api-access-hnv8t\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.862666 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.879255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.893927 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.908890 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.925855 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.940034 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.956440 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9edd7b0-a112-42be-b351-018a9f9c68e3-serviceca\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.956478 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9edd7b0-a112-42be-b351-018a9f9c68e3-host\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.956501 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnv8t\" (UniqueName: \"kubernetes.io/projected/a9edd7b0-a112-42be-b351-018a9f9c68e3-kube-api-access-hnv8t\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.957809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a9edd7b0-a112-42be-b351-018a9f9c68e3-serviceca\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.957886 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9edd7b0-a112-42be-b351-018a9f9c68e3-host\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.959997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.970912 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.980978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnv8t\" (UniqueName: \"kubernetes.io/projected/a9edd7b0-a112-42be-b351-018a9f9c68e3-kube-api-access-hnv8t\") pod \"node-ca-x5f7x\" (UID: \"a9edd7b0-a112-42be-b351-018a9f9c68e3\") " pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.983151 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:00 crc kubenswrapper[4735]: I1209 14:59:00.995361 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:00Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.005013 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.015225 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.042651 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc 
kubenswrapper[4735]: I1209 14:59:01.057531 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.057698 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:59:03.057676933 +0000 UTC m=+21.982515561 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.057820 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.057901 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.057996 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.058066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.057998 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058034 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058222 4735 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:03.058200268 +0000 UTC m=+21.983038897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058261 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:03.058251666 +0000 UTC m=+21.983090294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058296 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058315 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058329 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058372 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:03.058358758 +0000 UTC m=+21.983197375 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058532 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058598 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058657 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.058759 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:03.058735868 +0000 UTC m=+21.983574496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.085843 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.104636 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x5f7x" Dec 09 14:59:01 crc kubenswrapper[4735]: W1209 14:59:01.117705 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9edd7b0_a112_42be_b351_018a9f9c68e3.slice/crio-334f6ff30af810c5d8c753f2e0e4a56e53eef69b1e1f2a3e153a7380f9162116 WatchSource:0}: Error finding container 334f6ff30af810c5d8c753f2e0e4a56e53eef69b1e1f2a3e153a7380f9162116: Status 404 returned error can't find the container with id 334f6ff30af810c5d8c753f2e0e4a56e53eef69b1e1f2a3e153a7380f9162116 Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.121608 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.159373 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.198561 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.240958 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.413656 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.413789 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.414094 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.414150 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.414276 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.414323 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.418067 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.418802 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.419479 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.420123 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.420755 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.421274 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.421882 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.422428 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.423408 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.429119 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.429747 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.430837 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.431329 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.432235 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.432166 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-synce
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.432884 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.433737 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.434290 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.434701 4735 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.435578 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.436142 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.436624 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.437600 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.438047 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.439052 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.439485 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.440562 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.441177 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.441701 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.442683 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.443176 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.444182 4735 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.444281 4735 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.445798 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.446640 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.447069 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.448764 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.449684 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.450202 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.450183 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.451121 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.451736 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.452190 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.453073 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.453985 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.454586 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.455328 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.455842 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.456640 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.457277 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.458071 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.458506 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.458947 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.459497 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.459747 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.460247 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.461040 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.468762 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.485605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc 
kubenswrapper[4735]: I1209 14:59:01.501065 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.522982 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.538189 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x5f7x" event={"ID":"a9edd7b0-a112-42be-b351-018a9f9c68e3","Type":"ContainerStarted","Data":"9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.538236 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x5f7x" event={"ID":"a9edd7b0-a112-42be-b351-018a9f9c68e3","Type":"ContainerStarted","Data":"334f6ff30af810c5d8c753f2e0e4a56e53eef69b1e1f2a3e153a7380f9162116"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.541268 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.541298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.541310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.541320 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.541330 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.541337 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.545168 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ddf2068-c88d-46fd-97ac-eba38d91c642" containerID="ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba" exitCode=0 Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.545491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerDied","Data":"ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.559699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.597296 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.638834 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.681815 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.717636 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.756466 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.772573 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.774499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.774559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.774571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.774696 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.798574 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.851654 4735 kubelet_node_status.go:115] "Node 
was previously registered" node="crc" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.851993 4735 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.852983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.853019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.853030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.853051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.853061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.868217 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.870996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.871031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.871040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.871054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.871064 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.880492 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.881978 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.883152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.883190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.883201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.883216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.883225 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.891287 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.893647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.893747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.893826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.893898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.893970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.902892 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.905549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.905584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.905600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.905614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.905624 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.913570 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: E1209 14:59:01.913681 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.914930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.914965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.914976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.914990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.915004 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:01Z","lastTransitionTime":"2025-12-09T14:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.922444 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:01 crc kubenswrapper[4735]: I1209 14:59:01.960329 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:01Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.005031 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.017328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.017359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.017369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.017385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.017395 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.039473 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.079401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.119651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.119689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.119703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.119720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.119733 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.121082 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.158267 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.200243 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.222140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.222365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.222448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.222544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.222614 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.240755 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.279170 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\
"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.317464 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.329254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.329313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.329330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.329349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.329372 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.358574 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.398766 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.431669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.431718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.431729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.431747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.431760 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.438284 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.478622 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.534530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.534575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.534589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.534607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.534620 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.550662 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ddf2068-c88d-46fd-97ac-eba38d91c642" containerID="6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee" exitCode=0 Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.550717 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerDied","Data":"6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.553661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.565863 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.579106 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.598174 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.637062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.637095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.637105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.637121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.637132 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.642834 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.678742 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.719401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.739584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.739638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.739650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.739670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.739685 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.759956 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.799572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.841298 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.841740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.841786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.841801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.841822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.841839 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.876966 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.919702 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.944532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.944570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.944581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.944598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.944611 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:02Z","lastTransitionTime":"2025-12-09T14:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.958416 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:02 crc kubenswrapper[4735]: I1209 14:59:02.999232 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:02Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.038077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.046925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.046961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.046971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.047009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.047022 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.078761 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.078891 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.078930 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.078968 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:59:07.078932119 +0000 UTC m=+26.003770757 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079051 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079098 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079118 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079126 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:07.079101619 +0000 UTC m=+26.003940257 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079136 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.079049 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079157 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079187 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:07.07917098 +0000 UTC m=+26.004009607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079207 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:07.079197098 +0000 UTC m=+26.004035726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.079234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079394 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079420 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079437 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.079485 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:07.079476865 +0000 UTC m=+26.004315503 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.083494 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.116587 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.149677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.149713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.149723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.149740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.149751 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.158543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.198205 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.239001 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.251543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.251580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.251612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.251631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.251642 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.279606 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.321418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.354319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.354364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.354375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.354397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.354421 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.362805 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.399058 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.413214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.413278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.413298 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.413346 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.413439 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:03 crc kubenswrapper[4735]: E1209 14:59:03.413717 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.438864 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.456996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.457033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.457044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.457063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.457074 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.478476 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.518143 4735 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.558766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.558819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.558830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.558847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.559114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.560110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.560587 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.562012 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ddf2068-c88d-46fd-97ac-eba38d91c642" containerID="498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d" exitCode=0 Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.562193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerDied","Data":"498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.597825 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.637161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.662429 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.662460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.662472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.662486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.662497 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.682433 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.719996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.759132 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.764757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.764816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.764827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.764850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.764862 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.798670 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.838492 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.867497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.867559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.867570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.867592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.867604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.879768 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:
59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.917452 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.958601 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.970535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.970556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.970565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.970579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.970590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:03Z","lastTransitionTime":"2025-12-09T14:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:03 crc kubenswrapper[4735]: I1209 14:59:03.997208 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:03Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.039207 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.072797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.072823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.072854 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.072873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.072886 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.077139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.121959 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.156828 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.175205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.175256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.175267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.175281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.175292 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.197245 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.236452 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.277021 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.277044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.277054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.277066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.277076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.281432 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.378949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.378985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.378998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.379012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.379023 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.480895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.480943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.480953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.480972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.480986 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.568345 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ddf2068-c88d-46fd-97ac-eba38d91c642" containerID="be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91" exitCode=0 Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.568401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerDied","Data":"be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.582856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.582926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.582941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.582959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.583015 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.584113 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z 
is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.595939 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.604489 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.614014 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.624286 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.634229 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1
f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.642313 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.651796 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.660792 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.677077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.684793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.684822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.684832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.684847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.684858 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.717235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.764890 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.787858 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.787907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.787924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.787941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.787950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.799344 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.838461 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.877727 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:04Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.890912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.890952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.890962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.890979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.890991 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.993690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.993725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.993735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.993749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:04 crc kubenswrapper[4735]: I1209 14:59:04.993761 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:04Z","lastTransitionTime":"2025-12-09T14:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.095642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.095878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.095887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.095901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.095909 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.197923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.197968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.197978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.197994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.198007 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.299861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.299896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.299905 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.299917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.299926 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.401994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.402051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.402061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.402079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.402090 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.413376 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:05 crc kubenswrapper[4735]: E1209 14:59:05.413488 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.413382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:05 crc kubenswrapper[4735]: E1209 14:59:05.413580 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.413382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:05 crc kubenswrapper[4735]: E1209 14:59:05.413647 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.504562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.504599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.504610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.504626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.504636 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.573643 4735 generic.go:334] "Generic (PLEG): container finished" podID="5ddf2068-c88d-46fd-97ac-eba38d91c642" containerID="96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a" exitCode=0 Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.573707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerDied","Data":"96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.578394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.578714 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.578738 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.585383 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.597292 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.600248 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.602865 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.606277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.606303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.606312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.606323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.606333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.609703 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.618857 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.627672 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.636609 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.646386 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\
\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.671724 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.697758 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.709362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.709586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.709597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.709612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.709621 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.712340 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.722799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.736156 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.744155 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.751818 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.761372 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.768630 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.778738 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.788382 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.796608 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.805813 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.813692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.813724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.813733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.813748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.813756 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.818238 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a4486
92df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.832694 4735 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769a
a4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.841739 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.849942 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.877889 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.916389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.916424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.916433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.916447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.916457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:05Z","lastTransitionTime":"2025-12-09T14:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.917751 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.958765 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:05 crc kubenswrapper[4735]: I1209 14:59:05.996801 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:05Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.018275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.018311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.018321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.018336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.018346 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.037217 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.081377 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae0
6fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\
\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.120066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 
14:59:06.120098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.120107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.120125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.120137 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.222478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.222540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.222551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.222572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.222583 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.324688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.324724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.324734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.324748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.324758 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.426696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.426733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.426742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.426754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.426766 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.528840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.528873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.528882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.528894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.528904 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.584230 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" event={"ID":"5ddf2068-c88d-46fd-97ac-eba38d91c642","Type":"ContainerStarted","Data":"76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.584307 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.594989 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f027472
0cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.606028 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.615307 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.623866 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.630605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.630645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.630655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.630669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.630680 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.635385 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.644740 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.658581 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.667498 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.676414 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.685469 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.695713 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.703907 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.712006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.720713 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.732343 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.732374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.732382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.732395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.732405 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.735038 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://953be5b8c12530ee331e80f096762b6246ff5d0c
1909b8ae863a7e4648c7003a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:06Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.834094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.834129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.834139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.834153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.834162 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.937070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.937125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.937137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.937155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:06 crc kubenswrapper[4735]: I1209 14:59:06.937173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:06Z","lastTransitionTime":"2025-12-09T14:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.039895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.039934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.039943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.039962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.039978 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.120492 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.120613 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.120640 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120667 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:59:15.120643708 +0000 UTC m=+34.045482347 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.120704 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.120745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120766 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120783 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 
14:59:07.120806 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120845 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:15.120832675 +0000 UTC m=+34.045671303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120855 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120893 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:15.120883671 +0000 UTC m=+34.045722309 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120922 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.120986 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.121001 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.121011 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.121026 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:15.120997996 +0000 UTC m=+34.045836625 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.121053 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:15.121043182 +0000 UTC m=+34.045881820 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.141993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.142029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.142038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.142052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.142062 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.244347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.244387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.244397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.244414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.244427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.347109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.347165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.347178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.347195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.347208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.413870 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.413911 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.413951 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.414037 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.414126 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:07 crc kubenswrapper[4735]: E1209 14:59:07.414265 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.449982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.450023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.450034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.450049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.450059 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.552097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.552128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.552136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.552149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.552158 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.589434 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/0.log" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.592109 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a" exitCode=1 Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.592182 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.592761 4735 scope.go:117] "RemoveContainer" containerID="953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.612668 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef
858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.623205 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.637244 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.648992 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.655400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.655440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.655450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.655472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.655483 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.659222 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.668597 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.676335 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.687632 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.701577 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:07Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 14:59:07.513619 6014 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1209 14:59:07.513627 6014 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 14:59:07.513664 6014 factory.go:656] Stopping watch factory\\\\nI1209 14:59:07.513681 6014 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 14:59:07.513695 6014 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 14:59:07.513700 6014 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 14:59:07.513710 6014 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 14:59:07.513719 6014 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 14:59:07.513862 6014 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 14:59:07.513925 6014 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 14:59:07.514009 6014 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.709685 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\
"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.719683 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.729724 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.738484 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.747399 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.757738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.757774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.757783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.757809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.757820 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.758916 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:07Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.860279 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.860314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.860327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.860345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.860412 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.963425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.963476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.963485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.963503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:07 crc kubenswrapper[4735]: I1209 14:59:07.963537 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:07Z","lastTransitionTime":"2025-12-09T14:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.065779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.065828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.065840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.065856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.065867 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.168025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.168070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.168079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.168097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.168109 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.270346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.270409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.270419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.270441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.270452 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.372908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.372970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.372983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.373004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.373017 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.475643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.475726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.475738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.475759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.475774 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.578248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.578290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.578300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.578318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.578330 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.596663 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/1.log" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.597455 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/0.log" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.600704 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa" exitCode=1 Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.600822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.600916 4735 scope.go:117] "RemoveContainer" containerID="953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.601449 4735 scope.go:117] "RemoveContainer" containerID="7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa" Dec 09 14:59:08 crc kubenswrapper[4735]: E1209 14:59:08.601661 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.613409 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.623477 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.635175 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.642908 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.653422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.663951 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.672535 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.680504 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.680553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.680564 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.680578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.680589 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.682006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.695581 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.704077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.713038 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.721756 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.735198 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7
a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://953be5b8c12530ee331e80f096762b6246ff5d0c1909b8ae863a7e4648c7003a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:07Z\\\",\\\"message\\\":\\\"andler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1209 14:59:07.513619 6014 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1209 14:59:07.513627 6014 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1209 14:59:07.513664 6014 factory.go:656] Stopping watch factory\\\\nI1209 14:59:07.513681 6014 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1209 14:59:07.513695 6014 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1209 14:59:07.513700 6014 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1209 14:59:07.513710 6014 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1209 14:59:07.513719 6014 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 14:59:07.513862 6014 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1209 14:59:07.513925 6014 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1209 14:59:07.514009 6014 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 
per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.744308 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.751929 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:08Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.782843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.782966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.783029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.783097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.783152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.886002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.886045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.886057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.886076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.886089 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.988052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.988099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.988110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.988133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:08 crc kubenswrapper[4735]: I1209 14:59:08.988149 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:08Z","lastTransitionTime":"2025-12-09T14:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.090470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.090533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.090545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.090560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.090572 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.192559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.192606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.192619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.192638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.192651 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.294150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.294200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.294211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.294236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.294249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.396323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.396361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.396371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.396385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.396395 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.413823 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.413851 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.413916 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:09 crc kubenswrapper[4735]: E1209 14:59:09.414047 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:09 crc kubenswrapper[4735]: E1209 14:59:09.414167 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:09 crc kubenswrapper[4735]: E1209 14:59:09.414251 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.502450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.502483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.502496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.502527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.502541 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.603988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.604017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.604026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.604038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.604049 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.605168 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/1.log" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.608112 4735 scope.go:117] "RemoveContainer" containerID="7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa" Dec 09 14:59:09 crc kubenswrapper[4735]: E1209 14:59:09.608245 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.617659 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.626597 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.642157 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.651065 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.662305 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.671447 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.682242 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.692485 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.700558 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.705296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.705334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.705345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.705360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.705371 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.710815 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.719500 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.729931 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.740320 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.755988 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.765418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:09Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.807813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.807849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.807861 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.807876 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.807888 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.909986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.910022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.910036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.910049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:09 crc kubenswrapper[4735]: I1209 14:59:09.910058 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:09Z","lastTransitionTime":"2025-12-09T14:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.012412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.012458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.012467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.012483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.012496 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.114487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.114539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.114549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.114561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.114571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.216172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.216206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.216216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.216227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.216238 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.318590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.318633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.318642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.318654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.318666 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.421091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.421126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.421136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.421148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.421157 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.522778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.522826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.522835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.522854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.522876 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.625595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.625637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.625646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.625658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.625668 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.727864 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.727907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.727917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.727933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.727946 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.830779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.830827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.830838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.830851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.830860 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.933478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.933554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.933566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.933580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:10 crc kubenswrapper[4735]: I1209 14:59:10.933601 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:10Z","lastTransitionTime":"2025-12-09T14:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.035361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.035494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.035809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.035906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.035969 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.138228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.138294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.138310 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.138334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.138350 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.210920 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw"] Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.211649 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.213542 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.213568 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.224743 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.234791 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.240401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.240442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.240457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.240478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.240493 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.245014 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.254020 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.264231 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.270958 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.284147 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.306162 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.321620 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.341778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.342636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.342666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.342676 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.342692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.342703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.355111 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.362189 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.362244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.362266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmb9\" (UniqueName: \"kubernetes.io/projected/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-kube-api-access-flmb9\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.362461 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.362914 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.371233 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.379027 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.388499 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.401984 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.413043 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.413072 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.413047 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:11 crc kubenswrapper[4735]: E1209 14:59:11.413167 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:11 crc kubenswrapper[4735]: E1209 14:59:11.413274 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:11 crc kubenswrapper[4735]: E1209 14:59:11.413400 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.428283 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6
b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.439500 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.444430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.444465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.444480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.444493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.444505 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.452758 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463141 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmb9\" (UniqueName: \"kubernetes.io/projected/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-kube-api-access-flmb9\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463242 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463240 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.463907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.469499 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.473826 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.476999 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmb9\" (UniqueName: \"kubernetes.io/projected/d62a4496-0a3d-4e9f-a70a-7cf318f07dbf-kube-api-access-flmb9\") pod \"ovnkube-control-plane-749d76644c-h2gtw\" (UID: \"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.489654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.498750 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.506895 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.515411 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.522338 4735 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.529863 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: W1209 14:59:11.534527 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62a4496_0a3d_4e9f_a70a_7cf318f07dbf.slice/crio-8a5941df12339df9d785f7a740a41bcd72645dc418222ecebc8988b1f0fc7586 WatchSource:0}: Error finding container 8a5941df12339df9d785f7a740a41bcd72645dc418222ecebc8988b1f0fc7586: Status 404 returned error can't find the container with id 8a5941df12339df9d785f7a740a41bcd72645dc418222ecebc8988b1f0fc7586 Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.546596 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.546909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.546953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.546965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.546982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.546994 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.557940 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.577730 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.588643 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.600379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.609440 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:11Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.613676 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" event={"ID":"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf","Type":"ContainerStarted","Data":"8a5941df12339df9d785f7a740a41bcd72645dc418222ecebc8988b1f0fc7586"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.649770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.649819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.649833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.649853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.649867 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.752069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.752110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.752119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.752135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.752147 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.855067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.855123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.855133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.855154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.855166 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.958051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.958094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.958104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.958122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:11 crc kubenswrapper[4735]: I1209 14:59:11.958134 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:11Z","lastTransitionTime":"2025-12-09T14:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.060554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.060594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.060605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.060625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.060637 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.163012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.163050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.163059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.163073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.163082 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.203133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.203164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.203176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.203189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.203199 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.212677 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.217911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.218027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.218037 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.218052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.218061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.229671 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.232736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.232791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.232803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.232830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.232840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.241943 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.244827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.244854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.244863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.244882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.244895 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.254380 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.256799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.256839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.256850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.256862 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.256871 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.265971 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.266284 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.267585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.267615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.267625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.267636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.267645 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.369368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.369403 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.369413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.369426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.369436 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.471847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.471879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.471891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.471904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.471914 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.574054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.574263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.574331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.574397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.574470 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.618834 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" event={"ID":"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf","Type":"ContainerStarted","Data":"5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.618876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" event={"ID":"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf","Type":"ContainerStarted","Data":"5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.630437 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.639050 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.647352 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-jw8pr"] Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.647966 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.648035 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.649321 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef59
8621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.656766 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.665795 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.676435 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.676798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.676841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.676866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.676882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.676893 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.686293 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.695169 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.702481 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.716062 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.725146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.733622 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.741366 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.753658 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7
a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.761374 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.767962 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.777160 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwlvf\" (UniqueName: \"kubernetes.io/projected/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-kube-api-access-kwlvf\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.777229 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.777254 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.778752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.778818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.778830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.778844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.778853 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.785953 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.793645 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.808096 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.820669 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.829483 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.839338 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.848625 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.856783 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.864711 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.875784 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.877683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.877758 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwlvf\" (UniqueName: \"kubernetes.io/projected/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-kube-api-access-kwlvf\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.877880 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:12 crc kubenswrapper[4735]: E1209 14:59:12.877954 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:13.377931981 +0000 UTC m=+32.302770609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.881374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.881404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.881415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.881431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.881442 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.890497 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33c
ed71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.891571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwlvf\" (UniqueName: \"kubernetes.io/projected/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-kube-api-access-kwlvf\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " 
pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.899164 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.907350 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.916351 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.925925 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.934500 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:12Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.983342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.983372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.983381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.983400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:12 crc kubenswrapper[4735]: I1209 14:59:12.983410 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:12Z","lastTransitionTime":"2025-12-09T14:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.086296 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.086332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.086345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.086361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.086373 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.188361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.188395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.188404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.188417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.188427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.290248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.290275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.290285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.290312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.290321 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.383075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:13 crc kubenswrapper[4735]: E1209 14:59:13.383221 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:13 crc kubenswrapper[4735]: E1209 14:59:13.383277 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:14.383260035 +0000 UTC m=+33.308098662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.391674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.391696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.391705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.391717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.391726 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.413737 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.413787 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:13 crc kubenswrapper[4735]: E1209 14:59:13.413846 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.413871 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:13 crc kubenswrapper[4735]: E1209 14:59:13.413944 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:13 crc kubenswrapper[4735]: E1209 14:59:13.414108 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.494298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.494337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.494347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.494360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.494372 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.596320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.596363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.596372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.596384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.596394 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.698765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.698802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.698825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.698839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.698851 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.801183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.801218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.801229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.801243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.801257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.903909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.903961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.903972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.903984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:13 crc kubenswrapper[4735]: I1209 14:59:13.903992 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:13Z","lastTransitionTime":"2025-12-09T14:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.005919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.005945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.005954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.005967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.005975 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.108167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.108202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.108209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.108223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.108232 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.210063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.210111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.210121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.210138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.210152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.311692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.311750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.311761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.311776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.311787 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.392646 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:14 crc kubenswrapper[4735]: E1209 14:59:14.392772 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:14 crc kubenswrapper[4735]: E1209 14:59:14.392845 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:16.39282712 +0000 UTC m=+35.317665749 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.413334 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:14 crc kubenswrapper[4735]: E1209 14:59:14.413455 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.413616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.413646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.413656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.413673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.413685 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.515641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.515684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.515694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.515710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.515722 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.617566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.617598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.617609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.617623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.617633 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.719547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.719577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.719585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.719598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.719609 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.821619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.821682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.821693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.821710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.821721 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.923906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.923946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.923955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.923972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:14 crc kubenswrapper[4735]: I1209 14:59:14.923986 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:14Z","lastTransitionTime":"2025-12-09T14:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.026145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.026179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.026188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.026200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.026212 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.128030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.128064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.128072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.128088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.128098 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.199698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.199789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.199845 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.199813235 +0000 UTC m=+50.124651863 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.199878 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.199900 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.199912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.199932 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.199957 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-09 14:59:31.199938532 +0000 UTC m=+50.124777160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200004 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200032 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200032 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200057 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200069 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200041 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.20003344 +0000 UTC m=+50.124872069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200045 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200145 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200121 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.20011256 +0000 UTC m=+50.124951188 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.200206 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:31.200191929 +0000 UTC m=+50.125030567 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.229746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.229793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.229806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.229830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.229843 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.331162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.331190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.331202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.331274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.331294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.413172 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.413280 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.413186 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.413368 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.413171 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:15 crc kubenswrapper[4735]: E1209 14:59:15.413434 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.433067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.433119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.433133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.433143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.433152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.535078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.535114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.535122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.535135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.535144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.636806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.636841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.636851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.636865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.636874 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.738650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.738678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.738688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.738698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.738707 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.840912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.840967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.840978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.840991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.840999 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.943269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.943303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.943313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.943326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:15 crc kubenswrapper[4735]: I1209 14:59:15.943335 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:15Z","lastTransitionTime":"2025-12-09T14:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.045413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.045437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.045448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.045460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.045468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.075595 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.087500 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.094449 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.101238 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.113330 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.123314 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.132302 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.140332 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.147500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.147568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.147580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.147597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.147606 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.148775 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.163331 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.173203 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.181269 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.191321 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.200169 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.209412 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.216251 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.223627 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.236124 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:16Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.249675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.249702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.249712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.249725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.249734 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.351156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.351179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.351190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.351201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.351210 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.412244 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:16 crc kubenswrapper[4735]: E1209 14:59:16.412381 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:16 crc kubenswrapper[4735]: E1209 14:59:16.412442 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:20.412422009 +0000 UTC m=+39.337260638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.413083 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:16 crc kubenswrapper[4735]: E1209 14:59:16.413196 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.452870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.452903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.452915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.452925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.452932 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.554609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.554642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.554654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.554664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.554674 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.656358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.656399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.656409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.656421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.656431 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.758685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.758720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.758729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.758742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.758753 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.861133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.861165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.861174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.861184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.861192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.962773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.962815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.962838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.962847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:16 crc kubenswrapper[4735]: I1209 14:59:16.962856 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:16Z","lastTransitionTime":"2025-12-09T14:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.065142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.065182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.065193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.065207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.065216 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.167483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.167542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.167552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.167573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.167582 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.269484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.269558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.269568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.269581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.269590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.371600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.371630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.371664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.371678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.371687 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.413199 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:17 crc kubenswrapper[4735]: E1209 14:59:17.413308 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.413563 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.413664 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:17 crc kubenswrapper[4735]: E1209 14:59:17.413776 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:17 crc kubenswrapper[4735]: E1209 14:59:17.413938 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.473717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.473747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.473757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.473788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.473798 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.575886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.575910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.575917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.575971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.575982 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.678479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.678530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.678541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.678552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.678561 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.781709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.781747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.781759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.781783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.781795 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.883641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.883673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.883682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.883694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.883704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.985423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.985459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.985469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.985480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:17 crc kubenswrapper[4735]: I1209 14:59:17.985490 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:17Z","lastTransitionTime":"2025-12-09T14:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.087411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.087451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.087462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.087476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.087489 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.189559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.189599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.189608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.189623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.189633 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.291289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.291319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.291329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.291341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.291350 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.393239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.393268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.393278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.393290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.393298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.413893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:18 crc kubenswrapper[4735]: E1209 14:59:18.414010 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.494643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.494676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.494685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.494698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.494708 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.596271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.596303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.596312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.596325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.596333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.698596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.698646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.698655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.698667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.698678 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.800779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.800811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.800823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.800848 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.800856 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.903044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.903081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.903096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.903108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:18 crc kubenswrapper[4735]: I1209 14:59:18.903121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:18Z","lastTransitionTime":"2025-12-09T14:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.004877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.004913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.004922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.004936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.004946 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.107127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.107162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.107170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.107181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.107190 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.208352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.208381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.208390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.208404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.208414 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.310666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.310688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.310695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.310705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.310717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.412744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.412774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.412802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.412815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.412826 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.412969 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:19 crc kubenswrapper[4735]: E1209 14:59:19.413050 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.413072 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:19 crc kubenswrapper[4735]: E1209 14:59:19.413160 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.413234 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:19 crc kubenswrapper[4735]: E1209 14:59:19.413286 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.514961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.514993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.515019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.515034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.515045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.616497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.616556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.616567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.616579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.616588 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.718104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.718132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.718141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.718151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.718158 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.820406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.820465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.820475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.820485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.820492 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.922390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.922415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.922422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.922431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:19 crc kubenswrapper[4735]: I1209 14:59:19.922439 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:19Z","lastTransitionTime":"2025-12-09T14:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.024136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.024158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.024166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.024176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.024202 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.127028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.127068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.127078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.127091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.127100 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.229420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.229450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.229458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.229470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.229482 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.331777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.331806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.331818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.331847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.331857 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.413181 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:20 crc kubenswrapper[4735]: E1209 14:59:20.413305 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.434235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.434258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.434268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.434282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.434295 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.443729 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:20 crc kubenswrapper[4735]: E1209 14:59:20.443904 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:20 crc kubenswrapper[4735]: E1209 14:59:20.444004 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:28.443974907 +0000 UTC m=+47.368813535 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.536272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.536334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.536349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.536371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.536386 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.637746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.637782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.637797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.637809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.637820 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.739765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.739804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.739815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.739831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.739854 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.842476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.842552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.842566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.842578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.842586 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.944450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.944489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.944499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.944532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:20 crc kubenswrapper[4735]: I1209 14:59:20.944544 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:20Z","lastTransitionTime":"2025-12-09T14:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.046394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.046427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.046436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.046446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.046453 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.148611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.148685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.148707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.148723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.148734 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.250258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.250319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.250331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.250343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.250354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.351988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.352015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.352023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.352034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.352043 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.413619 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.413724 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.413723 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:21 crc kubenswrapper[4735]: E1209 14:59:21.413835 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:21 crc kubenswrapper[4735]: E1209 14:59:21.413914 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:21 crc kubenswrapper[4735]: E1209 14:59:21.414048 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.425208 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.433576 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.441287 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.450034 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.453659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.453691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.453701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.453714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.453724 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.461463 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.469188 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.478038 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.487423 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.496825 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.505751 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.514790 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.523057 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.536815 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.544620 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.551878 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.555461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.555497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.555507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.555551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.555566 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.560911 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.573571 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae0
6fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:21Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.657659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.657798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.657874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.657941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.658014 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.759665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.759774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.759939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.760088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.760155 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.861402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.861490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.861594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.861665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.861717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.962997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.963020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.963032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.963046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:21 crc kubenswrapper[4735]: I1209 14:59:21.963057 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:21Z","lastTransitionTime":"2025-12-09T14:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.065100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.065204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.065270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.065348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.065423 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.167691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.167732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.167742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.167761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.167773 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.269428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.269462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.269472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.269485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.269493 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.371089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.371141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.371152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.371164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.371173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.374922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.375066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.375129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.375189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.375249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.384632 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.387285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.387339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.387349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.387361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.387369 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.398421 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.401542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.401573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.401583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.401595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.401603 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.409737 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.412886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.412913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.412924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.412936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.412944 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.413006 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.413361 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.413612 4735 scope.go:117] "RemoveContainer" containerID="7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa" Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.421324 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.423553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.423584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.423593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.423604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.423613 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.431628 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: E1209 14:59:22.431743 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.473602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.473779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.473791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.473801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.473810 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.576353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.576405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.576416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.576438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.576460 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.646149 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/1.log" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.649059 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.649211 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.678570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.678610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.678621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.678503 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210
882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.678639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.678789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.694119 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.707475 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.716557 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.733990 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.748548 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.756180 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.764926 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.774536 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.780735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.780768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.780778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.780794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.780804 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.783401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.791257 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.801023 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.811722 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.826498 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.835280 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.846084 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.854617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:22Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.883029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.883069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.883081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.883110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.883124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.985548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.985583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.985595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.985608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:22 crc kubenswrapper[4735]: I1209 14:59:22.985617 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:22Z","lastTransitionTime":"2025-12-09T14:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.087980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.088025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.088034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.088051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.088063 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.189978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.190018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.190028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.190052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.190065 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.292247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.292289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.292299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.292316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.292329 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.394344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.394377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.394385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.394398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.394408 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.413890 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:23 crc kubenswrapper[4735]: E1209 14:59:23.413995 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.414228 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.414338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:23 crc kubenswrapper[4735]: E1209 14:59:23.414495 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:23 crc kubenswrapper[4735]: E1209 14:59:23.414606 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.495915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.495953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.495966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.495992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.496003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.597455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.597487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.597498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.597526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.597536 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.652913 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/2.log" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.653444 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/1.log" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.655636 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a" exitCode=1 Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.655679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.655725 4735 scope.go:117] "RemoveContainer" containerID="7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.656230 4735 scope.go:117] "RemoveContainer" containerID="9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a" Dec 09 14:59:23 crc kubenswrapper[4735]: E1209 14:59:23.656361 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.665290 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.675077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.683646 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.691708 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.698911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.698940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.698950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.698960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.698970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.699577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.708590 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.714996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.727861 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.736258 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.743602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.751113 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.758857 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.766441 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.774994 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.781478 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.788265 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.799965 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bb9dc967e64824b899063cb7cbc1f780e2b5bc7a6a7922b8db1a8779d6e83aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:08Z\\\",\\\"message\\\":\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.16\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1209 14:59:08.248043 6137 services_controller.go:452] Built service openshift-machine-config-operator/machine-config-controller per-node LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248056 6137 services_controller.go:453] Built service openshift-machine-config-operator/machine-config-controller template LB for network=default: []services.LB{}\\\\nI1209 14:59:08.248071 6137 services_controller.go:454] Service openshift-machine-config-operator/machine-config-controller for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1209 14:59:08.248070 6137 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default 
ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:23Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.800486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.800529 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.800541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.800554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.800563 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.902600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.902634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.902646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.902674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:23 crc kubenswrapper[4735]: I1209 14:59:23.902685 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:23Z","lastTransitionTime":"2025-12-09T14:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.004141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.004187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.004198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.004214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.004224 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.105442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.105476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.105487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.105499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.105507 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.207645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.207683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.207693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.207707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.207717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.309872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.309912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.309923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.309940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.309950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.412168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.412195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.412204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.412216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.412227 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.413352 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:24 crc kubenswrapper[4735]: E1209 14:59:24.413454 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.513871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.513901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.513909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.513923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.513933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.615664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.615697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.615707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.615720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.615730 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.659504 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/2.log" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.717324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.717356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.717364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.717375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.717384 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.820116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.820149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.820159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.820172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.820179 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.927610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.927645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.927654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.927669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:24 crc kubenswrapper[4735]: I1209 14:59:24.927678 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:24Z","lastTransitionTime":"2025-12-09T14:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.029239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.029263 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.029272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.029286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.029295 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.131869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.131902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.131913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.131923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.131932 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.233984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.234015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.234024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.234038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.234049 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.336743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.336798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.336810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.336831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.336845 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.413510 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.413563 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.413577 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:25 crc kubenswrapper[4735]: E1209 14:59:25.413639 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:25 crc kubenswrapper[4735]: E1209 14:59:25.413734 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:25 crc kubenswrapper[4735]: E1209 14:59:25.413787 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.439005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.439053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.439068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.439085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.439100 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.541313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.541346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.541355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.541368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.541377 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.644006 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.644056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.644069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.644087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.644101 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.746279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.746314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.746327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.746341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.746354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.848736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.848882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.848944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.849010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.849065 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.950890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.951012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.951094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.951156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:25 crc kubenswrapper[4735]: I1209 14:59:25.951209 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:25Z","lastTransitionTime":"2025-12-09T14:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.056198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.056562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.056634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.056717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.056781 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.159151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.159261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.159332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.159394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.159459 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.261970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.262011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.262020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.262034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.262045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.363589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.363618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.363629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.363643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.363652 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.413617 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:26 crc kubenswrapper[4735]: E1209 14:59:26.413732 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.465464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.465500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.465508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.465539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.465552 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.567197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.567228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.567237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.567248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.567257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.676028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.676059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.676070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.676084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.676093 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.777709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.777735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.777743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.777754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.777765 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.879598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.879637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.879646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.879657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.879667 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.981500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.981565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.981578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.981594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:26 crc kubenswrapper[4735]: I1209 14:59:26.981604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:26Z","lastTransitionTime":"2025-12-09T14:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.083261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.083311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.083320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.083334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.083342 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.185180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.185207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.185218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.185230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.185239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.286787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.286822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.286831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.286842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.286851 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.388000 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.388752 4735 scope.go:117] "RemoveContainer" containerID="9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a" Dec 09 14:59:27 crc kubenswrapper[4735]: E1209 14:59:27.388911 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.389697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.389756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.389766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.389782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.389789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.398929 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.407004 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.413731 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.413748 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.413759 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:27 crc kubenswrapper[4735]: E1209 14:59:27.413806 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:27 crc kubenswrapper[4735]: E1209 14:59:27.413877 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:27 crc kubenswrapper[4735]: E1209 14:59:27.413966 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.414779 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e9116
99a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.427486 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210
882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.436591 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.446070 4735 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.454014 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.462987 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.472694 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.479175 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.485893 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.491643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: 
I1209 14:59:27.491680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.491692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.491705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.491716 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.498027 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b7
0a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.507582 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.521039 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.529285 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.537603 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.544599 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:27Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.594644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.594672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.594691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.594707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.594717 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.698664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.699500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.699593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.699669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.699726 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.802366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.802531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.802590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.802654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.802710 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.905235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.905282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.905298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.905319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:27 crc kubenswrapper[4735]: I1209 14:59:27.905331 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:27Z","lastTransitionTime":"2025-12-09T14:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.007297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.007327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.007336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.007348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.007356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.109355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.109389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.109399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.109412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.109421 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.211451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.211482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.211494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.211507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.211530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.313767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.313810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.313819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.313835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.313849 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.413964 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:28 crc kubenswrapper[4735]: E1209 14:59:28.414209 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.415496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.415545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.415555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.415574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.415590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.512377 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:28 crc kubenswrapper[4735]: E1209 14:59:28.512585 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:28 crc kubenswrapper[4735]: E1209 14:59:28.512685 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 14:59:44.512658625 +0000 UTC m=+63.437497253 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.516925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.516962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.516974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.516994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.517006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.618670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.618709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.618719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.618733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.618745 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.720676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.720718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.720726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.720737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.720745 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.822491 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.822537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.822546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.822558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.822568 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.924602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.924633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.924643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.924654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:28 crc kubenswrapper[4735]: I1209 14:59:28.924664 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:28Z","lastTransitionTime":"2025-12-09T14:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.026756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.026809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.026821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.026835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.026874 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.127983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.128018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.128028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.128042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.128051 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.229958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.230074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.230085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.230096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.230107 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.331634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.331665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.331674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.331687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.331699 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.413934 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.413968 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:29 crc kubenswrapper[4735]: E1209 14:59:29.414052 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.414118 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:29 crc kubenswrapper[4735]: E1209 14:59:29.414209 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:29 crc kubenswrapper[4735]: E1209 14:59:29.414258 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.433580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.433627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.433637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.433650 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.433657 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.535538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.535565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.535575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.535584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.535593 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.637179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.637344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.637415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.637479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.637563 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.738887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.738912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.738920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.738931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.738939 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.840785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.840814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.840824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.840832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.840840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.944858 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.944906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.944915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.944932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:29 crc kubenswrapper[4735]: I1209 14:59:29.944943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:29Z","lastTransitionTime":"2025-12-09T14:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.048482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.048724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.048734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.048749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.048760 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.150741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.150782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.150792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.150807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.150816 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.252328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.252457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.252549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.252614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.252674 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.354786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.354935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.355004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.355078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.355146 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.413844 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:30 crc kubenswrapper[4735]: E1209 14:59:30.413971 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.456978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.457015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.457025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.457039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.457049 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.559356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.559588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.559598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.559610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.559620 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.661962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.662036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.662049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.662076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.662091 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.764260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.764293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.764303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.764317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.764328 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.865767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.865818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.865833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.865850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.865866 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.967999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.968042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.968055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.968072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:30 crc kubenswrapper[4735]: I1209 14:59:30.968083 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:30Z","lastTransitionTime":"2025-12-09T14:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.070119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.070160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.070172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.070186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.070196 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.172215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.172259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.172270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.172289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.172303 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.235343 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.235403 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.235426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.235449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.235465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235601 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:03.235571 +0000 UTC m=+82.160409648 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235609 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235633 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235660 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235662 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235682 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235709 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:03.235686197 +0000 UTC m=+82.160524835 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235718 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235756 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235729 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:03.235719259 +0000 UTC m=+82.160557908 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235777 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235814 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:03.235780145 +0000 UTC m=+82.160618772 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.235852 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:03.235827874 +0000 UTC m=+82.160666502 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.273944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.273980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.273990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.274003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.274013 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.375759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.375791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.375799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.375810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.375821 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.413588 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.413658 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.413721 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.413843 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.413969 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:31 crc kubenswrapper[4735]: E1209 14:59:31.414063 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.424231 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.433974 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.442163 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.450653 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.457686 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.470585 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.477166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.477198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.477211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 
14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.477226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.477251 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.484283 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.490997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.498718 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.511262 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.521435 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.530938 4735 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.539955 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.549224 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.558968 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.566346 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.574141 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:31Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.579859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: 
I1209 14:59:31.579919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.579933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.579948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.579959 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.682765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.682802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.682812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.682826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.682839 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.785067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.785101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.785111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.785126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.785135 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.886492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.886536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.886546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.886560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.886568 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.988489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.988551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.988575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.988590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:31 crc kubenswrapper[4735]: I1209 14:59:31.988601 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:31Z","lastTransitionTime":"2025-12-09T14:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.090791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.090828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.090838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.090850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.090859 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.192613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.192635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.192643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.192653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.192660 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.294466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.294495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.294505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.294531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.294540 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.320702 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.328910 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.332543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"nam
e\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.339593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.346548 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.355846 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.367379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.376549 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.384536 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.391683 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.396315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.396344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.396356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.396369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.396379 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.410496 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.413451 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.413628 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.419809 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.428024 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.436158 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.444040 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.451232 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.457263 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.463925 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.475403 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.500055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.500084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.500094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.500109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.500118 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.602316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.602348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.602358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.602371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.602382 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.704191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.704241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.704252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.704273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.704285 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.740153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.740242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.740310 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.740389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.740448 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.749413 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.752159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.752199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.752212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.752227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.752237 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.760435 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.763247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.763279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.763289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.763307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.763317 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.771251 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.773707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.773739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.773750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.773762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.773769 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.781435 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.783768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.783800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.783811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.783824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.783834 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.792744 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:32Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:32 crc kubenswrapper[4735]: E1209 14:59:32.792890 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.806075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.806107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.806119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.806134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.806143 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.909249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.909302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.909315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.909334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:32 crc kubenswrapper[4735]: I1209 14:59:32.909347 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:32Z","lastTransitionTime":"2025-12-09T14:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.011043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.011076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.011085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.011096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.011108 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.113077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.113113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.113124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.113137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.113147 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.215536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.215564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.215573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.215583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.215591 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.318222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.318255 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.318264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.318278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.318288 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.413713 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.413789 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:33 crc kubenswrapper[4735]: E1209 14:59:33.413837 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.413720 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:33 crc kubenswrapper[4735]: E1209 14:59:33.413916 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:33 crc kubenswrapper[4735]: E1209 14:59:33.413965 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.420349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.420442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.420540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.420604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.420683 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.523066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.523191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.523251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.523313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.523374 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.625708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.625814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.625874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.625953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.626014 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.728122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.728160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.728173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.728187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.728199 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.829771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.829800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.829812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.829826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.829836 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.931796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.931822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.931830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.931840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:33 crc kubenswrapper[4735]: I1209 14:59:33.931851 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:33Z","lastTransitionTime":"2025-12-09T14:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.033873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.033908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.033918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.033933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.033943 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.136005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.136042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.136074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.136087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.136097 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.237783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.237815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.237824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.237837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.237846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.339268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.339308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.339319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.339333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.339346 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.413399 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:34 crc kubenswrapper[4735]: E1209 14:59:34.413497 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.441258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.441292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.441302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.441315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.441324 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.543305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.543334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.543343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.543353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.543361 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.645438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.645490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.645506 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.645562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.645578 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.746954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.747008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.747020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.747036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.747047 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.848554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.848615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.848628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.848640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.848650 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.950942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.950986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.950999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.951010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:34 crc kubenswrapper[4735]: I1209 14:59:34.951019 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:34Z","lastTransitionTime":"2025-12-09T14:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.052749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.052786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.052797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.052811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.052823 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.154682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.154715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.154727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.154739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.154748 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.256696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.256731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.256742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.256754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.256762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.358221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.358273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.358284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.358297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.358306 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.413343 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.413378 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:35 crc kubenswrapper[4735]: E1209 14:59:35.413480 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.413547 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:35 crc kubenswrapper[4735]: E1209 14:59:35.413655 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:35 crc kubenswrapper[4735]: E1209 14:59:35.413698 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.459804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.459913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.459972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.460050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.460116 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.562026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.562125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.562189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.562260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.562318 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.664103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.664133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.664142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.664155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.664165 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.766065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.766092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.766101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.766113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.766121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.867755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.867793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.867802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.867817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.867827 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.969624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.969663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.969672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.969684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:35 crc kubenswrapper[4735]: I1209 14:59:35.969692 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:35Z","lastTransitionTime":"2025-12-09T14:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.071371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.071398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.071407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.071417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.071426 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.173748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.173798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.173808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.173822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.173832 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.275195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.275224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.275232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.275244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.275252 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.377099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.377146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.377155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.377168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.377176 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.413479 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:36 crc kubenswrapper[4735]: E1209 14:59:36.413611 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.478599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.478635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.478665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.478680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.478687 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.580425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.580455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.580463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.580473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.580482 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.682305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.682360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.682369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.682380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.682390 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.783820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.783868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.783884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.783911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.783925 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.885752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.885779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.885786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.885799 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.885807 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.987712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.987747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.987757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.987768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:36 crc kubenswrapper[4735]: I1209 14:59:36.987775 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:36Z","lastTransitionTime":"2025-12-09T14:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.089843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.089870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.089878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.089899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.089907 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.191409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.191437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.191446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.191455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.191462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.292880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.292908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.292916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.292926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.292933 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.394041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.394177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.394251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.394320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.394396 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.413550 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.413564 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:37 crc kubenswrapper[4735]: E1209 14:59:37.413806 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:37 crc kubenswrapper[4735]: E1209 14:59:37.413728 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.413583 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:37 crc kubenswrapper[4735]: E1209 14:59:37.413882 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.495691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.495831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.495908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.495981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.496051 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.597755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.597782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.597790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.597801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.597809 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.700097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.700124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.700133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.700143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.700150 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.802414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.802442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.802450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.802459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.802466 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.904462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.904691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.904765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.904829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:37 crc kubenswrapper[4735]: I1209 14:59:37.904902 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:37Z","lastTransitionTime":"2025-12-09T14:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.006156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.006193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.006203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.006213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.006222 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.108024 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.108053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.108062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.108073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.108081 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.210061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.210089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.210097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.210107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.210114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.312094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.312137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.312147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.312159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.312169 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.413058 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:38 crc kubenswrapper[4735]: E1209 14:59:38.413164 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.413735 4735 scope.go:117] "RemoveContainer" containerID="9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a" Dec 09 14:59:38 crc kubenswrapper[4735]: E1209 14:59:38.413880 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.414061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.414089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.414098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.414110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.414120 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.515585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.515609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.515618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.515628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.515635 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.617270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.617293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.617303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.617312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.617321 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.718470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.718525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.718536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.718549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.718559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.820370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.820406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.820415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.820436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.820448 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.922237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.922286 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.922295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.922307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:38 crc kubenswrapper[4735]: I1209 14:59:38.922314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:38Z","lastTransitionTime":"2025-12-09T14:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.023890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.023937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.023947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.023964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.023973 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.125790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.125824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.125835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.125847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.125855 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.227555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.227589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.227600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.227615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.227626 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.328990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.329043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.329053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.329065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.329074 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.413185 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:39 crc kubenswrapper[4735]: E1209 14:59:39.413267 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.413185 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:39 crc kubenswrapper[4735]: E1209 14:59:39.413374 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.413186 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:39 crc kubenswrapper[4735]: E1209 14:59:39.413487 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.430720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.430749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.430758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.430770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.430779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.532686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.532716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.532724 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.532733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.532741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.634146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.634169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.634180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.634190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.634198 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.736318 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.736366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.736378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.736392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.736404 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.838141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.838173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.838182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.838195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.838204 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.940056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.940096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.940108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.940125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:39 crc kubenswrapper[4735]: I1209 14:59:39.940137 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:39Z","lastTransitionTime":"2025-12-09T14:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.041895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.041935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.041944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.041955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.041964 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.143667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.143710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.143719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.143732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.143740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.245794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.245830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.245840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.245853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.245861 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.348147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.348181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.348192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.348210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.348221 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.413013 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:40 crc kubenswrapper[4735]: E1209 14:59:40.413126 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.449938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.449969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.449976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.450004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.450014 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.552790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.552818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.552828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.552849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.552859 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.655122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.655152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.655162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.655174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.655182 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.756789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.756814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.756822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.756831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.756840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.858679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.858712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.858720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.858734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.858742 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.960094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.960117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.960125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.960136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:40 crc kubenswrapper[4735]: I1209 14:59:40.960144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:40Z","lastTransitionTime":"2025-12-09T14:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.062094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.062129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.062141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.062151 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.062161 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.163452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.163478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.163485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.163495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.163504 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.265320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.265359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.265368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.265377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.265385 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.366436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.366461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.366470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.366478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.366485 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.413002 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.413003 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:41 crc kubenswrapper[4735]: E1209 14:59:41.413097 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.413120 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:41 crc kubenswrapper[4735]: E1209 14:59:41.413175 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:41 crc kubenswrapper[4735]: E1209 14:59:41.413245 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.422928 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.432062 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.440315 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.450486 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.460482 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.467750 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.468061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.468092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.468102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.468116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.468127 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.476219 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.484504 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.491825 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.499404 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.513350 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.521014 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.533694 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.540737 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.546681 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.553498 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.564792 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.570008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.570055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.570065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.570077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.570086 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.572188 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:41Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.671652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.671681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.671690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.671703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.671715 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.773398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.773439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.773450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.773465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.773477 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.874756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.874793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.874821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.874837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.874846 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.976369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.976409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.976420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.976435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:41 crc kubenswrapper[4735]: I1209 14:59:41.976444 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:41Z","lastTransitionTime":"2025-12-09T14:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.077920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.077957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.077968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.077982 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.077992 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.179544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.179573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.179582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.179595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.179622 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.281497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.281560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.281571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.281583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.281591 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.383081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.383110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.383118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.383129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.383138 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.413033 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.413156 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.485235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.485268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.485276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.485290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.485300 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.586842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.586872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.586901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.586925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.586935 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.688586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.688608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.688617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.688626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.688649 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.790229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.790257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.790265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.790276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.790283 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.836054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.836081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.836108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.836120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.836128 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.845766 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.848118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.848158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.848170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.848186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.848196 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.856268 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.858398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.858421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.858462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.858477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.858486 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.865699 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.867715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.867736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.867744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.867755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.867762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.874850 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.876772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.876792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.876801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.876809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.876817 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.884173 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:42Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:42 crc kubenswrapper[4735]: E1209 14:59:42.884272 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.892378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.892407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.892416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.892428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.892435 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.994589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.994619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.994627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.994639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:42 crc kubenswrapper[4735]: I1209 14:59:42.994646 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:42Z","lastTransitionTime":"2025-12-09T14:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.096447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.096483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.096494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.096506 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.096540 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.197812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.197842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.197851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.197863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.197873 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.299603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.299653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.299662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.299672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.299679 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.401738 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.401797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.401807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.401824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.401831 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.413331 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.413359 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:43 crc kubenswrapper[4735]: E1209 14:59:43.413434 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.413462 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:43 crc kubenswrapper[4735]: E1209 14:59:43.413541 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:43 crc kubenswrapper[4735]: E1209 14:59:43.413602 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.503429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.503465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.503476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.503490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.503501 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.605610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.605712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.605723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.605735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.605743 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.706826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.706853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.706863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.706874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.706883 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.808600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.808718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.808780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.808854 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.808922 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.910465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.910497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.910505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.910533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:43 crc kubenswrapper[4735]: I1209 14:59:43.910543 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:43Z","lastTransitionTime":"2025-12-09T14:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.012479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.012578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.012590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.012600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.012612 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.114503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.114558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.114568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.114581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.114590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.215657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.215677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.215685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.215694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.215702 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.318935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.318983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.318993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.319008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.319019 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.413835 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:44 crc kubenswrapper[4735]: E1209 14:59:44.413945 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.420316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.420348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.420357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.420366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.420373 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.522507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.522568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.522578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.522593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.522602 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.540878 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:44 crc kubenswrapper[4735]: E1209 14:59:44.541041 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:44 crc kubenswrapper[4735]: E1209 14:59:44.541101 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 15:00:16.541087026 +0000 UTC m=+95.465925645 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.624197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.624233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.624241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.624272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.624324 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.726420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.726452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.726463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.726476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.726486 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.828431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.828460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.828468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.828480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.828488 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.930095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.930126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.930135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.930145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:44 crc kubenswrapper[4735]: I1209 14:59:44.930156 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:44Z","lastTransitionTime":"2025-12-09T14:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.032051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.032107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.032120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.032134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.032145 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.134436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.134475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.134484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.134501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.134524 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.236285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.236323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.236334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.236348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.236358 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.338184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.338225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.338234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.338250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.338262 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.413944 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.414007 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:45 crc kubenswrapper[4735]: E1209 14:59:45.414037 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.413944 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:45 crc kubenswrapper[4735]: E1209 14:59:45.414119 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:45 crc kubenswrapper[4735]: E1209 14:59:45.414160 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.440213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.440252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.440261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.440272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.440281 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.542438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.542543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.542558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.542589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.542608 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.645453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.645525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.645538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.645549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.645559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.719682 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/0.log" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.719734 4735 generic.go:334] "Generic (PLEG): container finished" podID="67d17a09-b547-49cf-8195-5af12413f51c" containerID="8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18" exitCode=1 Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.719768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerDied","Data":"8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.720242 4735 scope.go:117] "RemoveContainer" containerID="8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.735710 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.745256 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.748173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.748205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.748215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.748229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.748239 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.755964 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.766995 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.779739 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.786807 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.794577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.803495 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.814135 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.824192 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.833235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.842147 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.850653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.850689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.850701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.850718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.850729 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.855147 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.864029 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.871530 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.879799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.892837 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.902567 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:45Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.953768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.953805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.953816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.953836 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:45 crc kubenswrapper[4735]: I1209 14:59:45.953848 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:45Z","lastTransitionTime":"2025-12-09T14:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.056137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.056208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.056226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.056254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.056270 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.158853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.158890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.158903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.158929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.158941 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.260971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.261010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.261019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.261036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.261047 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.363129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.363171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.363181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.363199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.363214 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.413816 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:46 crc kubenswrapper[4735]: E1209 14:59:46.413965 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.465116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.465153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.465163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.465176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.465187 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.566907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.566968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.566978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.566990 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.567002 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.669531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.669561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.669570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.669585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.669596 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.725809 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/0.log" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.725873 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerStarted","Data":"70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.737502 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.753381 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.764281 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.771699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.771731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.771744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.771761 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.771773 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.774626 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.783162 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.792746 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.802782 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.813755 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.821633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.829742 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.839648 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.849329 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.859241 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.868573 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.873917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.873968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.873981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.873997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.874008 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.876611 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-
12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.890055 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.899351 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.907738 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:46Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.976575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.976621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.976635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.976654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:46 crc kubenswrapper[4735]: I1209 14:59:46.976669 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:46Z","lastTransitionTime":"2025-12-09T14:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.078572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.078619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.078631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.078646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.078657 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.180317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.180349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.180358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.180371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.180379 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.282860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.282898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.282911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.282935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.282950 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.385973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.386012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.386022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.386039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.386046 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.413369 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.413413 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:47 crc kubenswrapper[4735]: E1209 14:59:47.413494 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.413606 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:47 crc kubenswrapper[4735]: E1209 14:59:47.413685 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:47 crc kubenswrapper[4735]: E1209 14:59:47.413820 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.488306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.488353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.488365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.488392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.488407 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.590378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.590399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.590407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.590419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.590431 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.691870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.691898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.691909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.691921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.691938 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.794277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.794317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.794331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.794349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.794360 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.896807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.896838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.896849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.896863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.896873 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.999371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.999423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.999434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.999449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:47 crc kubenswrapper[4735]: I1209 14:59:47.999465 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:47Z","lastTransitionTime":"2025-12-09T14:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.102376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.102405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.102414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.102424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.102433 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.204564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.204597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.204608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.204620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.204630 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.307044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.307089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.307100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.307114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.307123 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.409412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.409481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.409492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.409553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.409568 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.413872 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:48 crc kubenswrapper[4735]: E1209 14:59:48.413993 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.512306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.512362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.512373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.512391 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.512404 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.613805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.613848 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.613859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.613873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.613883 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.715533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.715579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.715591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.715608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.715620 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.817178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.817216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.817226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.817242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.817254 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.919552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.919580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.919590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.919599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:48 crc kubenswrapper[4735]: I1209 14:59:48.919609 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:48Z","lastTransitionTime":"2025-12-09T14:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.021693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.021718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.021726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.021737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.021750 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.124017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.124057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.124070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.124089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.124102 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.226365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.226567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.226655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.226733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.226808 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.329346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.329379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.329389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.329402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.329412 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.413109 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:49 crc kubenswrapper[4735]: E1209 14:59:49.413242 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.413295 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.413421 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:49 crc kubenswrapper[4735]: E1209 14:59:49.413428 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:49 crc kubenswrapper[4735]: E1209 14:59:49.413757 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.431138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.431177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.431190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.431207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.431219 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.533317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.533366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.533377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.533387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.533397 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.635115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.635170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.635182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.635205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.635221 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.736802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.736840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.736855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.736869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.736882 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.839029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.839069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.839079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.839092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.839102 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.941468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.941508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.941536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.941553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:49 crc kubenswrapper[4735]: I1209 14:59:49.941565 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:49Z","lastTransitionTime":"2025-12-09T14:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.043393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.043443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.043456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.043475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.043484 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.145425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.145594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.145683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.145766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.145828 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.248287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.248325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.248335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.248351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.248363 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.350314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.350353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.350366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.350379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.350391 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.413070 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:50 crc kubenswrapper[4735]: E1209 14:59:50.413265 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.453810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.453847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.453860 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.453876 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.453905 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.557425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.557462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.557472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.557485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.557496 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.659561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.659605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.659617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.659636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.659647 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.761231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.761264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.761273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.761289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.761298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.862596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.862627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.862640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.862656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.862668 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.964606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.964677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.964689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.964713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:50 crc kubenswrapper[4735]: I1209 14:59:50.964727 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:50Z","lastTransitionTime":"2025-12-09T14:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.066384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.066423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.066434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.066445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.066457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.168616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.168656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.168669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.168683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.168692 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.271695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.271729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.271741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.271760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.271770 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.374501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.374559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.374568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.374580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.374590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.414036 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.414778 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.414820 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:51 crc kubenswrapper[4735]: E1209 14:59:51.414851 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:51 crc kubenswrapper[4735]: E1209 14:59:51.414947 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:51 crc kubenswrapper[4735]: E1209 14:59:51.415288 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.415661 4735 scope.go:117] "RemoveContainer" containerID="9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.430075 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.443720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.455112 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.471888 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.477341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.477369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.477378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.477391 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.477400 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.480713 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.490389 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.497325 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.510111 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f148
37534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.519085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.528706 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.537346 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.547160 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.555800 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.564421 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.572865 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.580270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.580307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.580321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.580340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.580354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.581435 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.12
6.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.590883 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.609078 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210
882a2b980df41e3715d8020a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.683325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.683365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.683377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.683398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.683415 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.743794 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/2.log" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.747589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.748812 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.771235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.782820 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.785233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.785265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.785275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.785288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.785298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.805345 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831
f4a0b3c0bad956461f21dba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default 
ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.815826 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.825032 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.837498 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.846394 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.855541 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.872163 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.881176 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.888035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.888080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.888093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.888108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.888120 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.888988 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.899677 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.911472 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.923076 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.934326 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.945718 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.960069 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.969007 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:51Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.990891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.990961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.990974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.990992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:51 crc kubenswrapper[4735]: I1209 14:59:51.991302 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:51Z","lastTransitionTime":"2025-12-09T14:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.093807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.093852 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.093863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.093880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.093891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.196446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.196485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.196496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.196527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.196538 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.298962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.298997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.299007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.299020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.299032 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.400722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.400776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.400785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.400803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.400817 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.413106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:52 crc kubenswrapper[4735]: E1209 14:59:52.413221 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.503601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.503630 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.503639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.503674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.503692 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.605832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.605877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.605896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.605917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.605931 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.708786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.708846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.708857 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.708877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.708904 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.751815 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/3.log" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.752627 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/2.log" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.755088 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" exitCode=1 Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.755126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.755161 4735 scope.go:117] "RemoveContainer" containerID="9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.755807 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 14:59:52 crc kubenswrapper[4735]: E1209 14:59:52.755951 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.779597 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.791349 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.799975 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.809758 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.811405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.811447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.811457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.811475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.811487 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.825764 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.835424 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.847190 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.855915 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.869201 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831
f4a0b3c0bad956461f21dba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fa741889020600f66466cdcd6101b7ab30b4210882a2b980df41e3715d8020a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:23Z\\\",\\\"message\\\":\\\"Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 10257 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{kube-controller-manager: true,},ClusterIP:10.217.4.36,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.36],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1209 14:59:23.089913 6372 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:23.089949 6372 lb_config.go:1031] Cluster endpoints for openshift-kube-controller-manager/kube-controller-manager for network=default ar\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:22Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:52Z\\\",\\\"message\\\":\\\"ifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:52.173538 6790 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1209 14:59:52.173540 6790 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1209 14:59:52.173546 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.879806 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.900149 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.911789 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.913092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.913122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.913133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.913150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.913161 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:52Z","lastTransitionTime":"2025-12-09T14:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.922058 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.932704 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.943145 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.950733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.959000 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:52 crc kubenswrapper[4735]: I1209 14:59:52.970329 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:52Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.022764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.022812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.022824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.022843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.022856 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.028851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.028929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.028943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.028966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.028980 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.039091 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.042213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.042268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.042281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.042304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.042315 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.052288 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.055213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.055249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.055262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.055277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.055288 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.065307 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.068470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.068504 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.068530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.068544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.068554 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.077800 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.080415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.080537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.080552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.080569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.080580 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.089791 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.089916 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.124383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.124408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.124417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.124428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.124437 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.226823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.226849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.226858 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.226883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.226895 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.328430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.328459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.328469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.328484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.328497 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.413167 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.413271 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.413386 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.413574 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.413645 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.413790 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.430124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.430156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.430165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.430177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.430187 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.531298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.531324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.531335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.531350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.531360 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.633649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.634045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.634130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.634206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.634450 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.737392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.737438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.737452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.737479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.737493 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.760120 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/3.log" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.765004 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 14:59:53 crc kubenswrapper[4735]: E1209 14:59:53.765161 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.774654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 
2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.788875 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.810778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:52Z\\\",\\\"message\\\":\\\"ifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:52.173538 6790 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1209 14:59:52.173540 6790 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1209 14:59:52.173546 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.825008 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.836561 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 
14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.839632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.839655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.839665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.839678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.839690 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.848016 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 
14:59:53.857944 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.867088 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.879811 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.887929 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.895779 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.904949 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.913162 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.921419 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.936647 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.941928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.941989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.941999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.942016 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.942026 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:53Z","lastTransitionTime":"2025-12-09T14:59:53Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.944250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-
12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.957006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:53 crc kubenswrapper[4735]: I1209 14:59:53.965923 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T14:59:53Z is after 2025-08-24T17:21:41Z" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.043572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.043610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.043623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.043641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.043655 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.145760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.145787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.145798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.145815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.145827 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.247719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.247744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.247754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.247768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.247779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.350008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.350069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.350085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.350104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.350119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.413623 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:54 crc kubenswrapper[4735]: E1209 14:59:54.413742 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.452434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.452464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.452493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.452507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.452534 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.554636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.554666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.554677 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.554692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.554703 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.656609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.656648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.656659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.656674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.656684 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.758103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.758142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.758155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.758168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.758177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.860058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.860087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.860099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.860111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.860120 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.961814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.961856 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.961867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.961877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:54 crc kubenswrapper[4735]: I1209 14:59:54.961885 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:54Z","lastTransitionTime":"2025-12-09T14:59:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.064306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.064350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.064362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.064376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.064388 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.165610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.165635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.165643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.165654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.165664 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.267541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.267573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.267582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.267594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.267604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.369086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.369115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.369125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.369137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.369145 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.413745 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.413768 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:55 crc kubenswrapper[4735]: E1209 14:59:55.413861 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.413934 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:55 crc kubenswrapper[4735]: E1209 14:59:55.413975 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:55 crc kubenswrapper[4735]: E1209 14:59:55.414110 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.470697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.470729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.470739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.470751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.470761 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.573014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.573052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.573069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.573091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.573107 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.675374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.675406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.675435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.675449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.675456 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.777460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.777500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.777530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.777542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.777553 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.879600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.879625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.879634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.879646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.879654 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.981851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.981885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.981894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.981903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:55 crc kubenswrapper[4735]: I1209 14:59:55.981910 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:55Z","lastTransitionTime":"2025-12-09T14:59:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.083841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.083883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.083896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.083910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.083919 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.186466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.186538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.186550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.186577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.186593 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.288968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.288993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.289001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.289012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.289019 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.391234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.391282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.391294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.391306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.391316 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.413627 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:56 crc kubenswrapper[4735]: E1209 14:59:56.413836 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.492555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.492595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.492624 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.492636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.492647 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.594685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.594839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.594915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.594997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.595067 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.696539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.696587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.696599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.696619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.696633 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.798383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.798417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.798431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.798445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.798454 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.899874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.899912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.899924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.899937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:56 crc kubenswrapper[4735]: I1209 14:59:56.899948 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:56Z","lastTransitionTime":"2025-12-09T14:59:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.001853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.001884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.001893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.001906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.001916 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.103889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.103957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.103972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.103998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.104016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.205885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.205921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.205932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.205943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.205953 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.308180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.308216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.308228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.308241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.308249 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.410209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.410271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.410284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.410299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.410310 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.413595 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.413601 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:57 crc kubenswrapper[4735]: E1209 14:59:57.413775 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.413611 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:57 crc kubenswrapper[4735]: E1209 14:59:57.413863 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:57 crc kubenswrapper[4735]: E1209 14:59:57.413967 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.512051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.512084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.512093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.512108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.512116 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.614068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.614109 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.614118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.614131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.614141 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.716722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.716786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.716795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.716815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.716828 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.818768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.818814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.818824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.818838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.818849 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.920770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.920817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.920830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.920845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:57 crc kubenswrapper[4735]: I1209 14:59:57.920855 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:57Z","lastTransitionTime":"2025-12-09T14:59:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.022671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.022729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.022742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.022772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.022783 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.124431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.124459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.124469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.124481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.124492 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.226573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.226619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.226628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.226646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.226657 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.328922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.328970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.328980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.328991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.329000 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.413779 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 14:59:58 crc kubenswrapper[4735]: E1209 14:59:58.413943 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.431445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.431559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.431637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.431710 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.431776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.533276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.533335 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.533345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.533359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.533369 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.634828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.634868 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.634882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.634897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.634908 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.736781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.736840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.736850 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.736863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.736872 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.839130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.839256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.839327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.839400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.839454 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.941631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.941666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.941694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.941717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:58 crc kubenswrapper[4735]: I1209 14:59:58.941736 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:58Z","lastTransitionTime":"2025-12-09T14:59:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.043901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.044207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.044272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.044350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.044409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.146845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.146975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.147074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.147154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.147213 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.248946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.249000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.249013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.249026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.249038 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.350985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.351213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.351323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.351398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.351468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.413572 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.413622 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.413672 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 14:59:59 crc kubenswrapper[4735]: E1209 14:59:59.413825 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 14:59:59 crc kubenswrapper[4735]: E1209 14:59:59.413968 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 14:59:59 crc kubenswrapper[4735]: E1209 14:59:59.414079 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.453396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.453444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.453459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.453482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.453497 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.555968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.556116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.556186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.556245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.556318 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.657978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.658031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.658045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.658063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.658074 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.759754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.760102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.760167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.760243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.760312 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.862142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.862185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.862196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.862216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.862228 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.964950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.965000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.965010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.965062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 14:59:59 crc kubenswrapper[4735]: I1209 14:59:59.965075 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T14:59:59Z","lastTransitionTime":"2025-12-09T14:59:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.067408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.067586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.067662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.067767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.067831 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.169468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.169503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.169528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.169545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.169554 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.271646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.271671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.271679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.271705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.271718 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.374021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.374064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.374078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.374097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.374114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.413632 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:00 crc kubenswrapper[4735]: E1209 15:00:00.413842 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.476322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.476364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.476374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.476392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.476402 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.579313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.579352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.579365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.579381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.579392 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.681922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.681954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.681964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.681976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.681985 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.783928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.783961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.783971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.783983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.783995 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.885449 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.885487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.885550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.885567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.885578 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.987821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.987844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.987853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.987865 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:00 crc kubenswrapper[4735]: I1209 15:00:00.987874 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:00Z","lastTransitionTime":"2025-12-09T15:00:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.090214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.090250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.090259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.090273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.090285 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.191792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.191821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.191830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.191840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.191847 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.294091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.294118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.294127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.294137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.294147 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.397155 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.397208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.397219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.397239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.397253 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.413441 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.413502 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:01 crc kubenswrapper[4735]: E1209 15:00:01.413609 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.413638 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:01 crc kubenswrapper[4735]: E1209 15:00:01.413746 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:01 crc kubenswrapper[4735]: E1209 15:00:01.413882 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.423754 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.436551 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.453571 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics
-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:52Z\\\",\\\"message\\\":\\\"ifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:52.173538 6790 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1209 14:59:52.173540 6790 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1209 14:59:52.173546 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.462746 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.471795 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 
15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.482861 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.494418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.499407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.499445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.499458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.499476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.499489 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.503699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.513795 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.521227 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.528465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.538358 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.546942 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.555913 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.565469 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.574026 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 
15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.588580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.598036 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:01Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.601459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.601487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.601498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.601528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.601538 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.703790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.703839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.703851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.703874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.703891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.805783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.805833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.805845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.805863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.805875 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.908467 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.908504 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.908538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.908554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:01 crc kubenswrapper[4735]: I1209 15:00:01.908565 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:01Z","lastTransitionTime":"2025-12-09T15:00:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.013349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.013540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.013568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.013590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.013604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.120575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.120618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.120629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.120658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.120673 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.223700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.223741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.223752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.223767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.223776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.326287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.326336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.326347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.326371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.326385 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.413278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:02 crc kubenswrapper[4735]: E1209 15:00:02.413614 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.428855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.428898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.428908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.428925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.428938 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.531054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.531093 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.531105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.531117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.531127 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.633395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.633443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.633455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.633475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.633487 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.735706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.735759 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.735771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.735790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.735803 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.838069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.838112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.838122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.838137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.838147 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.939951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.939984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.939994 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.940007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:02 crc kubenswrapper[4735]: I1209 15:00:02.940016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:02Z","lastTransitionTime":"2025-12-09T15:00:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.042166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.042204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.042213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.042225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.042234 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.144351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.144388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.144401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.144413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.144422 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.246388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.246435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.246445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.246465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.246477 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.307165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.307266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307295 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-09 15:01:07.307266579 +0000 UTC m=+146.232105208 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.307325 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.307367 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307397 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.307401 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307437 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 15:01:07.307427548 +0000 UTC m=+146.232266177 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307490 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307497 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307557 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-09 15:01:07.307547191 +0000 UTC m=+146.232385829 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307557 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307589 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307636 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-09 15:01:07.30761652 +0000 UTC m=+146.232455148 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307728 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307770 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307787 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.307859 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-09 15:01:07.307838452 +0000 UTC m=+146.232677079 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.348606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.348645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.348655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.348673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.348687 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.413102 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.413171 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.413239 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.413295 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.413430 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.413578 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.450700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.450755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.450771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.450794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.450808 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.468708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.468740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.468751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.468765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.468779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.480594 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:03Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.484602 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.484648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.484676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.484694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.484704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.496169 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:03Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.499429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.499494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.499505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.499542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.499559 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.509290 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:03Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.512459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.512490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.512503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.512533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.512548 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.521003 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:03Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.523507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.523566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.523574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.523591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.523603 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.533059 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-09T15:00:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2ea56a57-18d8-4f3a-8391-c192c3891ec8\\\",\\\"systemUUID\\\":\\\"ddaa7a2d-0e19-463b-a4d7-3ee14e7916ea\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:03Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:03 crc kubenswrapper[4735]: E1209 15:00:03.533183 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.553416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.553545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.553639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.553911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.553976 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.662773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.662807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.662816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.662828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.662837 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.764719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.764846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.765067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.765128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.765193 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.867021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.867084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.867095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.867118 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.867138 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.969455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.969498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.969507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.969547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:03 crc kubenswrapper[4735]: I1209 15:00:03.969556 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:03Z","lastTransitionTime":"2025-12-09T15:00:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.071773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.071803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.071814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.071824 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.071835 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.174053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.174078 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.174087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.174097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.174106 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.275987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.276017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.276026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.276036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.276045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.378160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.378185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.378195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.378205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.378214 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.413772 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:04 crc kubenswrapper[4735]: E1209 15:00:04.413899 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.481048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.481121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.481133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.481161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.481180 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.583684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.583733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.583745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.583763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.583776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.685541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.685597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.685607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.685621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.685632 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.787616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.787649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.787661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.787674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.787686 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.889436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.889471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.889480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.889508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.889536 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.991336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.991371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.991380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.991392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:04 crc kubenswrapper[4735]: I1209 15:00:04.991403 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:04Z","lastTransitionTime":"2025-12-09T15:00:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.093249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.093283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.093294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.093306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.093314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.195447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.195507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.195541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.195564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.195588 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.297672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.297722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.297734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.297751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.297768 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.399733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.399793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.399808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.399839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.399852 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.413155 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.413248 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.413717 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:05 crc kubenswrapper[4735]: E1209 15:00:05.413651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:05 crc kubenswrapper[4735]: E1209 15:00:05.413928 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:05 crc kubenswrapper[4735]: E1209 15:00:05.413967 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.414788 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:00:05 crc kubenswrapper[4735]: E1209 15:00:05.414971 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.501658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.501701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.501715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.501730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.501742 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.603918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.603953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.603963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.603977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.603990 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.706069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.706114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.706125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.706140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.706150 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.808029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.808071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.808081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.808095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.808105 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.910271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.910326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.910338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.910357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:05 crc kubenswrapper[4735]: I1209 15:00:05.910386 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:05Z","lastTransitionTime":"2025-12-09T15:00:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.012334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.012371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.012381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.012396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.012405 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.114486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.114581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.114593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.114614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.114626 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.216610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.216663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.216675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.216696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.216710 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.318471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.318507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.318540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.318562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.318574 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.413198 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:06 crc kubenswrapper[4735]: E1209 15:00:06.413346 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.420611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.420651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.420667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.420717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.420732 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.523429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.523476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.523487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.523508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.523543 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.625193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.625344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.625413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.625503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.625624 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.727661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.727698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.727709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.727726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.727735 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.829842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.829879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.829890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.829902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.829911 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.932462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.932503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.932528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.932560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:06 crc kubenswrapper[4735]: I1209 15:00:06.932571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:06Z","lastTransitionTime":"2025-12-09T15:00:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.035246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.035292 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.035303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.035317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.035329 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.137825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.137859 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.137869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.137882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.137891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.240381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.240430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.240442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.240461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.240472 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.343959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.344029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.344044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.344066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.344080 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.413743 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.413802 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.413814 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:07 crc kubenswrapper[4735]: E1209 15:00:07.414205 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:07 crc kubenswrapper[4735]: E1209 15:00:07.414275 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:07 crc kubenswrapper[4735]: E1209 15:00:07.414048 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.447031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.447072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.447083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.447101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.447115 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.549829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.549874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.549884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.549899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.549912 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.652445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.652492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.652501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.652541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.652560 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.755015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.755043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.755052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.755066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.755076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.857312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.857353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.857361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.857386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.857397 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.960087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.960147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.960158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.960180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:07 crc kubenswrapper[4735]: I1209 15:00:07.960194 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:07Z","lastTransitionTime":"2025-12-09T15:00:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.062158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.062204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.062215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.062231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.062243 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.164590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.164643 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.164659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.164673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.164684 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.266902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.266941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.266950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.266966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.266974 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.368800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.368839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.368849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.368866 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.368875 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.413582 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:08 crc kubenswrapper[4735]: E1209 15:00:08.413740 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.470469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.470541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.470552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.470570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.470583 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.572828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.572882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.572897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.572916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.572929 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.675466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.675535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.675547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.675564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.675575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.777580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.777625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.777636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.777653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.777663 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.879385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.879450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.879461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.879479 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.879501 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.982015 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.982059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.982069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.982085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:08 crc kubenswrapper[4735]: I1209 15:00:08.982096 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:08Z","lastTransitionTime":"2025-12-09T15:00:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.084877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.084936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.084948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.084972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.084985 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.186607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.186645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.186655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.186669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.186681 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.288615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.288674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.288684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.288697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.288705 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.391105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.391144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.391153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.391167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.391180 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.413591 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.413608 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.413665 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:09 crc kubenswrapper[4735]: E1209 15:00:09.413706 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:09 crc kubenswrapper[4735]: E1209 15:00:09.413762 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:09 crc kubenswrapper[4735]: E1209 15:00:09.414033 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.493431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.493453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.493463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.493504 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.493530 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.595369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.595443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.595453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.595464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.595480 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.697413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.697438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.697448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.697460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.697490 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.799258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.799303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.799313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.799328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.799338 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.901591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.901645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.901654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.901667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:09 crc kubenswrapper[4735]: I1209 15:00:09.901678 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:09Z","lastTransitionTime":"2025-12-09T15:00:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.003708 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.003741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.003750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.003762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.003770 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.105665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.105693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.105702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.105713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.105722 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.207853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.207889 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.207918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.207931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.207941 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.310068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.310141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.310163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.310194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.310211 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.412322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.412368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.412379 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.412394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.412404 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.413562 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:10 crc kubenswrapper[4735]: E1209 15:00:10.413685 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.514633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.514657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.514667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.514679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.514687 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.616806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.616839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.616851 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.616863 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.616871 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.718636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.718680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.718692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.718711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.718722 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.820333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.820378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.820386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.820402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.820417 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.922269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.922323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.922333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.922346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:10 crc kubenswrapper[4735]: I1209 15:00:10.922357 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:10Z","lastTransitionTime":"2025-12-09T15:00:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.024054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.024089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.024097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.024108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.024137 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.126018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.126050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.126060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.126072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.126084 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.227721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.227753 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.227763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.227773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.227783 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.329648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.329706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.329717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.329741 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.329754 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.413805 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.413967 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:11 crc kubenswrapper[4735]: E1209 15:00:11.414062 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.414142 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:11 crc kubenswrapper[4735]: E1209 15:00:11.414202 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:11 crc kubenswrapper[4735]: E1209 15:00:11.414325 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.422459 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.426203 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x5f7x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9edd7b0-a112-42be-b351-018a9f9c68e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9cb8a7ecef011d3684158e535a0e520129e78deb65d6c70425b788b63f129025\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnv8t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x5f7x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.431224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.431262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.431271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.431285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.431298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.434133 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwlvf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:12Z\\\"}}\" for 
pod \"openshift-multus\"/\"network-metrics-daemon-jw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.444264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3857fdec-9ac6-41b1-9504-8c256da10835\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-09T14:58:58Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1209 14:58:53.460942 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1209 14:58:53.463422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4108989207/tls.crt::/tmp/serving-cert-4108989207/tls.key\\\\\\\"\\\\nI1209 14:58:58.841507 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1209 14:58:58.844808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1209 14:58:58.844829 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1209 14:58:58.844850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1209 14:58:58.844855 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1209 14:58:58.849102 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1209 14:58:58.849133 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849137 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1209 14:58:58.849141 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1209 14:58:58.849145 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1209 14:58:58.849148 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1209 14:58:58.849151 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1209 14:58:58.849327 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1209 14:58:58.850628 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.453562 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af6136a5cbe70af509d3abc20b9566f1b2bda757d0450084504a0d17f7f43f98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.462352 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.470035 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.479685 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ddf2068-c88d-46fd-97ac-eba38d91c642\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e8a836b18aa4d3c9a2d2fd8c45ff02d8e9d9bf07e50e961d48a6233b5bdc7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://705087b8bf99cbb8be235e461be0f42704d6ba5c68220324be146f2a578e4a53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef598621f12e5a448692df5468cc793673e6b6b9f818086e670e245469afadba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e918fbb0a2b363efa10845b5ddd658e06977f7fba127b3b5c6dfc98874438ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498d1f2b576193cbd26827f6a3d3db96765e601f7945938f535f37997590ea7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be1f654e6a16abd8fa599633e0e4c7393a0359f5c3ffca93e1ce78a8a7952a91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f382213adb2b9fa59ba6e141f466830e581003fbbce202da5330dc75b1b28a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:59:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qvmkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.492996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e495b8c7-63da-4e41-bb2c-8abf1eae06fb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b27aef9501d4eeca604a2f9d5ee826b47f3d8f399536e39592d3fad01469494\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60743bd389e13802d33ced71cca8733b3228378f9613a7d19410b49e76fcc019\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://330ecef511678c68d41222dd9def72bb0e96a85b94dddd0d5bdd7290eda2ee77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e5d6af9f89ed15d9d06ba568f310ed0967f14837534bde2148a870e02b6ff4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://540bdc99458e375a8cca092d5450e4083765dae786bb2b0c72f9dec7846cb50c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a387fef858c23405e1f3fc35540cd698d352ca90972caffcaf55fafab02b70a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e11c6386c9059c0a0ebc6bc49ed6b34514cb420fde92989fe387f6ad2fbbc5de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9769aa4f013df159f5672cd8e542d6ddb307d87c408b95e79a3066d00f1e4a20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.500891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b73bc01-ba54-4ddc-9884-98ab69ccfb68\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd2d9f721caaff326aa5197b1c26a9b500b16da2a6a33a1054a0a0d13f68bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9bd8f5edac7fbc02fff60b5285b5603f5a579029225e81a103e8adaf19878c6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ad4ad958b58f7912701b1e715f4543c84545a5a79c72a8d4897f9a7de8b5a7f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.508266 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.516106 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef6a6fe62c6f763f371925de3877ce266e8670a4668fc32f6d7817263848eefd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7df47357d07ef09ef86e2d64bc106523a4b3d69e0e48eb1ed056f7deca9f0501\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.524272 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xnf8f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67d17a09-b547-49cf-8195-5af12413f51c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:45Z\\\",\\\"message\\\":\\\"2025-12-09T14:59:00+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78\\\\n2025-12-09T14:59:00+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e38dc6d9-c112-434a-8f9f-bf1416c0ad78 to /host/opt/cni/bin/\\\\n2025-12-09T14:59:00Z [verbose] multus-daemon started\\\\n2025-12-09T14:59:00Z [verbose] Readiness Indicator file check\\\\n2025-12-09T14:59:45Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gxwb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xnf8f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.530880 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62a4496-0a3d-4e9f-a70a-7cf318f07dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5522879f43e747a6d454ebae31795e04f0b37a6a9d7252ead49aeb16d019c459\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5871f66a74961fb65f971c85b0aac8641b459baa3bea586d857dac5091fd1786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-flmb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:59:11Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-h2gtw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 
15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.533298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.533333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.533346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.533361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.533372 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.538333 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d84d0d2b-217a-408e-9e06-b081c2213fca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46e14d22dd898b2543aa9e339ec2faab8dd457a5aee42ebd1948a9d12e73fc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45535b8bcb1404124d926d1d9c0ace548706f561f285fa378235ff9cac720b69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20a0407c8ce274843a0d89ba655e67a8f1a246ceea6869b4ca13bcfc26c8c0b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9961b4071b853878d42f69dffb9c389c9fab5ff8e52bd120002b10822df1426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.545235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38830b387149c01f2be9ef8204fefd3fdaf0eb34a20368f3078ea3e4e673c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.551192 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qhfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9617623e-09bb-4eb1-9b58-025df7afa461\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f5f859278496c9399ffcc12c1fd42c54b2c359d3585a2232c9fa3db437f30e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdgvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qhfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.557661 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9700326d-c8d3-42a5-8521-b0fab6ca8ffe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e77aa77ac4c8a0c6c20c08a5efbeb02b9db5a41b2a41f5bfe93e77c35d7682f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4wmq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t5lmh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.569330 4735 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9374566a-4662-4e98-ae18-6f52468332b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-09T14:59:52Z\\\",\\\"message\\\":\\\"ifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1209 14:59:52.173538 6790 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1209 14:59:52.173540 6790 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1209 14:59:52.173546 6790 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-09T14:59:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-09T14:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-09T14:58:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-09T14:58:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fn6dw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-09T14:58:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qblcd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-09T15:00:11Z is after 2025-08-24T17:21:41Z" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.635651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.635686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.635695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.635715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.635736 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.737184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.737220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.737231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.737246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.737258 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.838789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.838827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.838837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.838849 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.838861 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.941248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.941290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.941303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.941317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:11 crc kubenswrapper[4735]: I1209 15:00:11.941326 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:11Z","lastTransitionTime":"2025-12-09T15:00:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.042668 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.042719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.042729 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.042742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.042753 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.144476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.144600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.144615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.144634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.144649 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.246832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.246888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.246899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.246917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.246927 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.348782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.348816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.348827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.348841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.348851 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.413261 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:12 crc kubenswrapper[4735]: E1209 15:00:12.413355 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.450969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.450996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.451005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.451038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.451049 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.553500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.553554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.553564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.553575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.553587 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.655800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.655832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.655841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.655853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.655864 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.757731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.757756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.757766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.757777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.757787 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.860054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.860090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.860102 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.860116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.860126 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.961563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.961592 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.961600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.961610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:12 crc kubenswrapper[4735]: I1209 15:00:12.961617 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:12Z","lastTransitionTime":"2025-12-09T15:00:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.063083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.063106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.063116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.063126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.063135 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.165312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.165342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.165351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.165362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.165370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.267618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.267653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.267664 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.267678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.267690 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.369133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.369161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.369173 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.369187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.369196 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.413810 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.413830 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:13 crc kubenswrapper[4735]: E1209 15:00:13.413915 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.413996 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:13 crc kubenswrapper[4735]: E1209 15:00:13.414040 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:13 crc kubenswrapper[4735]: E1209 15:00:13.414148 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.470471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.470502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.470534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.470548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.470557 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.572736 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.572765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.572775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.572787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.572795 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.674478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.674501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.674526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.674538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.674547 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.776787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.776815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.776825 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.776855 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.776866 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.878272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.878312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.878323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.878338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.878348 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.915542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.915571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.915582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.915595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.915604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-09T15:00:13Z","lastTransitionTime":"2025-12-09T15:00:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.947402 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95"] Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.948019 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.949263 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.949494 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.950018 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.950659 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.971257 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x5f7x" podStartSLOduration=75.971244709 podStartE2EDuration="1m15.971244709s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:13.971060638 +0000 UTC m=+92.895899267" watchObservedRunningTime="2025-12-09 15:00:13.971244709 +0000 UTC m=+92.896083338" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.983539 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.983505385 podStartE2EDuration="2.983505385s" podCreationTimestamp="2025-12-09 15:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:13.983387606 +0000 UTC m=+92.908226234" watchObservedRunningTime="2025-12-09 15:00:13.983505385 +0000 UTC m=+92.908344013" Dec 09 15:00:13 crc kubenswrapper[4735]: I1209 15:00:13.993577 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=74.993569382 podStartE2EDuration="1m14.993569382s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:13.993479696 +0000 UTC m=+92.918318315" watchObservedRunningTime="2025-12-09 15:00:13.993569382 +0000 UTC m=+92.918408011" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.029113 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qvmkc" podStartSLOduration=76.029090056 podStartE2EDuration="1m16.029090056s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.028746508 +0000 UTC m=+92.953585136" watchObservedRunningTime="2025-12-09 15:00:14.029090056 +0000 UTC m=+92.953928694" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.073015 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.072997745 podStartE2EDuration="1m15.072997745s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-09 15:00:14.071445113 +0000 UTC m=+92.996283741" watchObservedRunningTime="2025-12-09 15:00:14.072997745 +0000 UTC m=+92.997836373" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.085927 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.085912854 podStartE2EDuration="1m15.085912854s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.085851299 +0000 UTC m=+93.010689927" watchObservedRunningTime="2025-12-09 15:00:14.085912854 +0000 UTC m=+93.010751482" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.103402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7cc7d0-712b-4304-8bed-987abbc5efa9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.103593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/aa7cc7d0-712b-4304-8bed-987abbc5efa9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.103744 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7cc7d0-712b-4304-8bed-987abbc5efa9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.103834 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/aa7cc7d0-712b-4304-8bed-987abbc5efa9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.103932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa7cc7d0-712b-4304-8bed-987abbc5efa9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.113340 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xnf8f" podStartSLOduration=76.113328826 podStartE2EDuration="1m16.113328826s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.113021506 +0000 UTC m=+93.037860134" watchObservedRunningTime="2025-12-09 15:00:14.113328826 +0000 UTC 
m=+93.038167454" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.121598 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-h2gtw" podStartSLOduration=75.121573236 podStartE2EDuration="1m15.121573236s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.1212304 +0000 UTC m=+93.046069029" watchObservedRunningTime="2025-12-09 15:00:14.121573236 +0000 UTC m=+93.046411865" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.131532 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.131496914 podStartE2EDuration="42.131496914s" podCreationTimestamp="2025-12-09 14:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.130965779 +0000 UTC m=+93.055804397" watchObservedRunningTime="2025-12-09 15:00:14.131496914 +0000 UTC m=+93.056335542" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.146684 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7qhfd" podStartSLOduration=76.146676125 podStartE2EDuration="1m16.146676125s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.146148807 +0000 UTC m=+93.070987435" watchObservedRunningTime="2025-12-09 15:00:14.146676125 +0000 UTC m=+93.071514754" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.153276 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podStartSLOduration=76.153261558 podStartE2EDuration="1m16.153261558s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.152921196 +0000 UTC m=+93.077759825" watchObservedRunningTime="2025-12-09 15:00:14.153261558 +0000 UTC m=+93.078100186" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa7cc7d0-712b-4304-8bed-987abbc5efa9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205498 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7cc7d0-712b-4304-8bed-987abbc5efa9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205538 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/aa7cc7d0-712b-4304-8bed-987abbc5efa9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: 
\"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205607 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7cc7d0-712b-4304-8bed-987abbc5efa9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205632 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/aa7cc7d0-712b-4304-8bed-987abbc5efa9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205692 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/aa7cc7d0-712b-4304-8bed-987abbc5efa9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.205714 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/aa7cc7d0-712b-4304-8bed-987abbc5efa9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.206454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa7cc7d0-712b-4304-8bed-987abbc5efa9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.211140 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7cc7d0-712b-4304-8bed-987abbc5efa9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.217088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa7cc7d0-712b-4304-8bed-987abbc5efa9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wrc95\" (UID: \"aa7cc7d0-712b-4304-8bed-987abbc5efa9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.262036 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.413088 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:14 crc kubenswrapper[4735]: E1209 15:00:14.413204 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.821963 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" event={"ID":"aa7cc7d0-712b-4304-8bed-987abbc5efa9","Type":"ContainerStarted","Data":"3d85df52f951de2f905ab8e491eb151b1fc3e789d70f453811995138e9239a7e"} Dec 09 15:00:14 crc kubenswrapper[4735]: I1209 15:00:14.822017 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" event={"ID":"aa7cc7d0-712b-4304-8bed-987abbc5efa9","Type":"ContainerStarted","Data":"0842ac52aadd1cf1b2fa279db51bb2266fff45723f3603f5d38a07b83111fd1b"} Dec 09 15:00:15 crc kubenswrapper[4735]: I1209 15:00:15.413033 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:15 crc kubenswrapper[4735]: I1209 15:00:15.413101 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:15 crc kubenswrapper[4735]: E1209 15:00:15.413148 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:15 crc kubenswrapper[4735]: E1209 15:00:15.413297 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:15 crc kubenswrapper[4735]: I1209 15:00:15.413347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:15 crc kubenswrapper[4735]: E1209 15:00:15.413449 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:16 crc kubenswrapper[4735]: I1209 15:00:16.413653 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:16 crc kubenswrapper[4735]: E1209 15:00:16.413819 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:16 crc kubenswrapper[4735]: I1209 15:00:16.626130 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:16 crc kubenswrapper[4735]: E1209 15:00:16.626263 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 15:00:16 crc kubenswrapper[4735]: E1209 15:00:16.626348 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs podName:6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2 nodeName:}" failed. No retries permitted until 2025-12-09 15:01:20.626330748 +0000 UTC m=+159.551169376 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs") pod "network-metrics-daemon-jw8pr" (UID: "6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 09 15:00:17 crc kubenswrapper[4735]: I1209 15:00:17.413539 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:17 crc kubenswrapper[4735]: E1209 15:00:17.413657 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:17 crc kubenswrapper[4735]: I1209 15:00:17.413694 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:17 crc kubenswrapper[4735]: I1209 15:00:17.413757 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:17 crc kubenswrapper[4735]: E1209 15:00:17.413790 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:17 crc kubenswrapper[4735]: E1209 15:00:17.413865 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:18 crc kubenswrapper[4735]: I1209 15:00:18.413408 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:18 crc kubenswrapper[4735]: E1209 15:00:18.413706 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:19 crc kubenswrapper[4735]: I1209 15:00:19.413990 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:19 crc kubenswrapper[4735]: I1209 15:00:19.414100 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:19 crc kubenswrapper[4735]: E1209 15:00:19.414106 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:19 crc kubenswrapper[4735]: I1209 15:00:19.414140 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:19 crc kubenswrapper[4735]: E1209 15:00:19.414419 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:19 crc kubenswrapper[4735]: E1209 15:00:19.414483 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:19 crc kubenswrapper[4735]: I1209 15:00:19.414822 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:00:19 crc kubenswrapper[4735]: E1209 15:00:19.414957 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 15:00:20 crc kubenswrapper[4735]: I1209 15:00:20.412967 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:20 crc kubenswrapper[4735]: E1209 15:00:20.413079 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:21 crc kubenswrapper[4735]: I1209 15:00:21.413317 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:21 crc kubenswrapper[4735]: I1209 15:00:21.413423 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:21 crc kubenswrapper[4735]: E1209 15:00:21.414491 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:21 crc kubenswrapper[4735]: I1209 15:00:21.414552 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:21 crc kubenswrapper[4735]: E1209 15:00:21.414627 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:21 crc kubenswrapper[4735]: E1209 15:00:21.414735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:22 crc kubenswrapper[4735]: I1209 15:00:22.413937 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:22 crc kubenswrapper[4735]: E1209 15:00:22.414057 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:23 crc kubenswrapper[4735]: I1209 15:00:23.413651 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:23 crc kubenswrapper[4735]: I1209 15:00:23.413656 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:23 crc kubenswrapper[4735]: I1209 15:00:23.413774 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:23 crc kubenswrapper[4735]: E1209 15:00:23.414146 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:23 crc kubenswrapper[4735]: E1209 15:00:23.414184 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:23 crc kubenswrapper[4735]: E1209 15:00:23.414242 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:24 crc kubenswrapper[4735]: I1209 15:00:24.412983 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:24 crc kubenswrapper[4735]: E1209 15:00:24.413129 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:25 crc kubenswrapper[4735]: I1209 15:00:25.413049 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:25 crc kubenswrapper[4735]: I1209 15:00:25.413070 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:25 crc kubenswrapper[4735]: I1209 15:00:25.413278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:25 crc kubenswrapper[4735]: E1209 15:00:25.413363 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:25 crc kubenswrapper[4735]: E1209 15:00:25.413416 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:25 crc kubenswrapper[4735]: E1209 15:00:25.413617 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:26 crc kubenswrapper[4735]: I1209 15:00:26.413384 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:26 crc kubenswrapper[4735]: E1209 15:00:26.413652 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:27 crc kubenswrapper[4735]: I1209 15:00:27.413580 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:27 crc kubenswrapper[4735]: I1209 15:00:27.413666 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:27 crc kubenswrapper[4735]: E1209 15:00:27.413701 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:27 crc kubenswrapper[4735]: I1209 15:00:27.413586 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:27 crc kubenswrapper[4735]: E1209 15:00:27.413780 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:27 crc kubenswrapper[4735]: E1209 15:00:27.413934 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:28 crc kubenswrapper[4735]: I1209 15:00:28.413457 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:28 crc kubenswrapper[4735]: E1209 15:00:28.413711 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:29 crc kubenswrapper[4735]: I1209 15:00:29.413942 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:29 crc kubenswrapper[4735]: I1209 15:00:29.414283 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:29 crc kubenswrapper[4735]: E1209 15:00:29.414340 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:29 crc kubenswrapper[4735]: I1209 15:00:29.414025 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:29 crc kubenswrapper[4735]: E1209 15:00:29.414566 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:29 crc kubenswrapper[4735]: E1209 15:00:29.414611 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:30 crc kubenswrapper[4735]: I1209 15:00:30.413597 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:30 crc kubenswrapper[4735]: E1209 15:00:30.413730 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:30 crc kubenswrapper[4735]: I1209 15:00:30.414207 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:00:30 crc kubenswrapper[4735]: E1209 15:00:30.414352 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qblcd_openshift-ovn-kubernetes(9374566a-4662-4e98-ae18-6f52468332b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.413214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.413347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:31 crc kubenswrapper[4735]: E1209 15:00:31.413341 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:31 crc kubenswrapper[4735]: E1209 15:00:31.414456 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.414648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:31 crc kubenswrapper[4735]: E1209 15:00:31.414721 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.864447 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/1.log" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.864903 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/0.log" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.864954 4735 generic.go:334] "Generic (PLEG): container finished" podID="67d17a09-b547-49cf-8195-5af12413f51c" containerID="70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed" exitCode=1 Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.864996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerDied","Data":"70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed"} Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.865033 4735 scope.go:117] "RemoveContainer" containerID="8aacd589a9c4b867d9d8871a1138f657c8cad5dc346aab3dfd7c2374b4c60f18" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.865338 4735 scope.go:117] "RemoveContainer" containerID="70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed" Dec 09 15:00:31 crc kubenswrapper[4735]: E1209 15:00:31.865504 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xnf8f_openshift-multus(67d17a09-b547-49cf-8195-5af12413f51c)\"" pod="openshift-multus/multus-xnf8f" podUID="67d17a09-b547-49cf-8195-5af12413f51c" Dec 09 15:00:31 crc kubenswrapper[4735]: I1209 15:00:31.879112 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wrc95" podStartSLOduration=93.879087018 podStartE2EDuration="1m33.879087018s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:14.831460339 +0000 UTC m=+93.756298966" watchObservedRunningTime="2025-12-09 15:00:31.879087018 +0000 UTC m=+110.803925646" Dec 09 15:00:32 crc kubenswrapper[4735]: I1209 15:00:32.413604 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:32 crc kubenswrapper[4735]: E1209 15:00:32.413712 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:32 crc kubenswrapper[4735]: I1209 15:00:32.868562 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/1.log" Dec 09 15:00:33 crc kubenswrapper[4735]: I1209 15:00:33.413326 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:33 crc kubenswrapper[4735]: I1209 15:00:33.413413 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:33 crc kubenswrapper[4735]: E1209 15:00:33.413442 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:33 crc kubenswrapper[4735]: E1209 15:00:33.413592 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:33 crc kubenswrapper[4735]: I1209 15:00:33.413629 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:33 crc kubenswrapper[4735]: E1209 15:00:33.413756 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:34 crc kubenswrapper[4735]: I1209 15:00:34.413245 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:34 crc kubenswrapper[4735]: E1209 15:00:34.413911 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:35 crc kubenswrapper[4735]: I1209 15:00:35.413662 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:35 crc kubenswrapper[4735]: I1209 15:00:35.413738 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:35 crc kubenswrapper[4735]: I1209 15:00:35.414029 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:35 crc kubenswrapper[4735]: E1209 15:00:35.414297 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:35 crc kubenswrapper[4735]: E1209 15:00:35.414760 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:35 crc kubenswrapper[4735]: E1209 15:00:35.414974 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:36 crc kubenswrapper[4735]: I1209 15:00:36.413149 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:36 crc kubenswrapper[4735]: E1209 15:00:36.413288 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:37 crc kubenswrapper[4735]: I1209 15:00:37.413598 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:37 crc kubenswrapper[4735]: I1209 15:00:37.413672 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:37 crc kubenswrapper[4735]: E1209 15:00:37.413710 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:37 crc kubenswrapper[4735]: E1209 15:00:37.413824 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:37 crc kubenswrapper[4735]: I1209 15:00:37.413852 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:37 crc kubenswrapper[4735]: E1209 15:00:37.413919 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:38 crc kubenswrapper[4735]: I1209 15:00:38.412956 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:38 crc kubenswrapper[4735]: E1209 15:00:38.413074 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:39 crc kubenswrapper[4735]: I1209 15:00:39.413282 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:39 crc kubenswrapper[4735]: I1209 15:00:39.413287 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:39 crc kubenswrapper[4735]: I1209 15:00:39.413286 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:39 crc kubenswrapper[4735]: E1209 15:00:39.413660 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:39 crc kubenswrapper[4735]: E1209 15:00:39.414062 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:39 crc kubenswrapper[4735]: E1209 15:00:39.414135 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:40 crc kubenswrapper[4735]: I1209 15:00:40.413240 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:40 crc kubenswrapper[4735]: E1209 15:00:40.413617 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:41 crc kubenswrapper[4735]: I1209 15:00:41.413592 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:41 crc kubenswrapper[4735]: I1209 15:00:41.413635 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:41 crc kubenswrapper[4735]: I1209 15:00:41.413791 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:41 crc kubenswrapper[4735]: E1209 15:00:41.414977 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:41 crc kubenswrapper[4735]: E1209 15:00:41.415016 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:41 crc kubenswrapper[4735]: E1209 15:00:41.415057 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:41 crc kubenswrapper[4735]: E1209 15:00:41.431038 4735 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 09 15:00:41 crc kubenswrapper[4735]: E1209 15:00:41.486237 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:00:42 crc kubenswrapper[4735]: I1209 15:00:42.413821 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:42 crc kubenswrapper[4735]: E1209 15:00:42.414411 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:43 crc kubenswrapper[4735]: I1209 15:00:43.413730 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:43 crc kubenswrapper[4735]: I1209 15:00:43.413833 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:43 crc kubenswrapper[4735]: I1209 15:00:43.413840 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:43 crc kubenswrapper[4735]: E1209 15:00:43.414206 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:43 crc kubenswrapper[4735]: E1209 15:00:43.414090 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:43 crc kubenswrapper[4735]: I1209 15:00:43.414091 4735 scope.go:117] "RemoveContainer" containerID="70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed" Dec 09 15:00:43 crc kubenswrapper[4735]: E1209 15:00:43.414398 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:43 crc kubenswrapper[4735]: I1209 15:00:43.898730 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/1.log" Dec 09 15:00:43 crc kubenswrapper[4735]: I1209 15:00:43.899161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerStarted","Data":"88bc3bb0b0d1327a3335aadac40c46ac49a79d37f1e1436ccb892cbaa982f40d"} Dec 09 15:00:44 crc kubenswrapper[4735]: I1209 15:00:44.413912 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:44 crc kubenswrapper[4735]: E1209 15:00:44.414047 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.413278 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.413406 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.413443 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:45 crc kubenswrapper[4735]: E1209 15:00:45.413890 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:45 crc kubenswrapper[4735]: E1209 15:00:45.413947 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:45 crc kubenswrapper[4735]: E1209 15:00:45.414088 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.414756 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.907562 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/3.log" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.910239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerStarted","Data":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.910643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 15:00:45 crc kubenswrapper[4735]: I1209 15:00:45.930384 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podStartSLOduration=107.930353046 podStartE2EDuration="1m47.930353046s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:45.929884266 +0000 UTC m=+124.854722894" watchObservedRunningTime="2025-12-09 15:00:45.930353046 +0000 UTC m=+124.855191674" Dec 09 15:00:46 crc kubenswrapper[4735]: I1209 15:00:46.083082 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jw8pr"] Dec 09 15:00:46 crc kubenswrapper[4735]: I1209 15:00:46.083216 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:46 crc kubenswrapper[4735]: E1209 15:00:46.083309 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:46 crc kubenswrapper[4735]: E1209 15:00:46.488099 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:00:47 crc kubenswrapper[4735]: I1209 15:00:47.413306 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:47 crc kubenswrapper[4735]: I1209 15:00:47.413360 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:47 crc kubenswrapper[4735]: E1209 15:00:47.413418 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:47 crc kubenswrapper[4735]: E1209 15:00:47.413490 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:47 crc kubenswrapper[4735]: I1209 15:00:47.413373 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:47 crc kubenswrapper[4735]: E1209 15:00:47.413610 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:48 crc kubenswrapper[4735]: I1209 15:00:48.413229 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:48 crc kubenswrapper[4735]: E1209 15:00:48.413360 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:49 crc kubenswrapper[4735]: I1209 15:00:49.413604 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:49 crc kubenswrapper[4735]: I1209 15:00:49.413633 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:49 crc kubenswrapper[4735]: E1209 15:00:49.413748 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:49 crc kubenswrapper[4735]: I1209 15:00:49.413899 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:49 crc kubenswrapper[4735]: E1209 15:00:49.414017 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:49 crc kubenswrapper[4735]: E1209 15:00:49.414157 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:50 crc kubenswrapper[4735]: I1209 15:00:50.413101 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:50 crc kubenswrapper[4735]: E1209 15:00:50.413303 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jw8pr" podUID="6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2" Dec 09 15:00:51 crc kubenswrapper[4735]: I1209 15:00:51.413835 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:51 crc kubenswrapper[4735]: I1209 15:00:51.413906 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:51 crc kubenswrapper[4735]: E1209 15:00:51.415266 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 09 15:00:51 crc kubenswrapper[4735]: I1209 15:00:51.415444 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:51 crc kubenswrapper[4735]: E1209 15:00:51.415601 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 09 15:00:51 crc kubenswrapper[4735]: E1209 15:00:51.415816 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 09 15:00:52 crc kubenswrapper[4735]: I1209 15:00:52.413478 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:00:52 crc kubenswrapper[4735]: I1209 15:00:52.415296 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 15:00:52 crc kubenswrapper[4735]: I1209 15:00:52.415296 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.413580 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.413693 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.413581 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.416096 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.416096 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.416096 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 15:00:53 crc kubenswrapper[4735]: I1209 15:00:53.417121 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.342493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.374200 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5fvv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.374849 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.375901 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fpbzm"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.376337 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.377249 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jgd5m"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.377336 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.377461 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.377595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.377803 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.379029 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r4hp5"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.379627 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380077 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380128 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380182 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380244 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380301 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380416 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380455 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380933 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.380963 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.381017 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.381413 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.388371 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.388763 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.389040 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.389536 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.389667 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.389928 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.390048 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.390237 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.390362 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.390497 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.392119 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.392172 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.392220 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.392645 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.393029 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.393047 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.395919 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.396662 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.398312 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-c4mlr"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.398777 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.398957 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.399952 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.403947 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.411179 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.411687 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412021 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412063 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412145 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412274 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412338 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412469 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412541 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412556 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412615 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.412626 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.413110 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.413372 4735 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.413463 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bxswn"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.414010 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.414619 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p77wx"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.414857 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415161 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415223 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415268 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415427 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415477 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415550 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415599 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415689 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415783 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415871 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.415952 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416103 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416140 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416173 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416271 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416412 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-56tt8"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416489 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.416817 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.417345 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.420214 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.420379 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxscb"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.420831 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.421073 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.421115 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.421148 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.421497 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.422429 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-r2xnv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.422485 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.422672 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.422759 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.422783 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.422894 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.423063 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.423230 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.423369 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gwkvf"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.423696 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.423873 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.424550 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425066 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425084 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425459 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425672 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425805 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425938 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.425992 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426014 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426092 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426421 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426685 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426794 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426935 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.426951 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.427075 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.427152 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.427261 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.427325 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.429943 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.430098 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.431373 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.431668 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.432539 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.437358 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.437507 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.437916 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.438738 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.440225 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.441659 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.443528 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-n8ljm"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.443554 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.444927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-config\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qs7q\" (UniqueName: \"kubernetes.io/projected/df271b84-5513-4840-9faa-9b66e4fd3487-kube-api-access-6qs7q\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445074 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-trusted-ca\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445102 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-serving-cert\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445160 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkdm8\" (UniqueName: \"kubernetes.io/projected/8eefb88b-418a-4287-9b76-7e4a54d1a461-kube-api-access-hkdm8\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eefb88b-418a-4287-9b76-7e4a54d1a461-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445443 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-ca\") pod \"etcd-operator-b45778765-bxswn\" (UID: 
\"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445462 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a186d2c-8f55-4033-9154-b4ff929c9a98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445482 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a186d2c-8f55-4033-9154-b4ff929c9a98-images\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.448636 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.449174 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.449337 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.449531 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.449983 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.450008 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.445503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb2gl\" (UniqueName: \"kubernetes.io/projected/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-kube-api-access-sb2gl\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.454809 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df271b84-5513-4840-9faa-9b66e4fd3487-serving-cert\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.454846 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-client\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.454899 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s2s6h\" (UniqueName: \"kubernetes.io/projected/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-kube-api-access-s2s6h\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.455036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eefb88b-418a-4287-9b76-7e4a54d1a461-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.455614 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-config\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.455648 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a186d2c-8f55-4033-9154-b4ff929c9a98-config\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.461611 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.461650 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.461756 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.461884 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.455670 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.462075 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.462122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-service-ca\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.462915 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.463118 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chwx\" (UniqueName: \"kubernetes.io/projected/3a186d2c-8f55-4033-9154-b4ff929c9a98-kube-api-access-7chwx\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.463151 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.463815 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.464230 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.465604 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.468432 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.468643 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.469250 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.472175 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.472738 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-r248q"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.473359 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.473952 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.474327 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.474480 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.474675 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.476154 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.478018 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.478562 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.479143 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.479368 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.480042 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.480959 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.481558 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.481998 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.482422 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.482908 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxhld"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.483981 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.484054 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.484468 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8vprh"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.484943 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.484071 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.487803 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.495608 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.496869 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.501891 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.510498 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.511177 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.512452 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r4hp5"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.516352 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.519223 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fpbzm"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.520200 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.521255 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxscb"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.522401 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2hgjx"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.523331 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jgd5m"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.523415 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.524080 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.524902 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.525943 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.526593 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.527458 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5fvv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.528308 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-c4mlr"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.529133 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.529983 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.530830 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.531647 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-56tt8"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.532484 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gwkvf"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.533357 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.534176 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-r248q"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.535034 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.535943 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.536241 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.536952 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.537789 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.539661 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.539701 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bxswn"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.540578 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.541053 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.542110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p77wx"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.543110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.543937 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.544785 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.546099 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.548050 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8vprh"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.548448 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-r2xnv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.549345 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.550478 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6kh8f"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.551081 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ms5pn"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.551388 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.552255 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxhld"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.552349 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.553174 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.554329 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6kh8f"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.555363 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ms5pn"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.556500 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566197 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df271b84-5513-4840-9faa-9b66e4fd3487-serving-cert\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566245 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-client\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2s6h\" (UniqueName: \"kubernetes.io/projected/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-kube-api-access-s2s6h\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eefb88b-418a-4287-9b76-7e4a54d1a461-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-config\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566411 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a186d2c-8f55-4033-9154-b4ff929c9a98-config\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566470 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-service-ca\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566531 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7chwx\" (UniqueName: \"kubernetes.io/projected/3a186d2c-8f55-4033-9154-b4ff929c9a98-kube-api-access-7chwx\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566557 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-config\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qs7q\" (UniqueName: \"kubernetes.io/projected/df271b84-5513-4840-9faa-9b66e4fd3487-kube-api-access-6qs7q\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-trusted-ca\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566624 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-serving-cert\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkdm8\" (UniqueName: \"kubernetes.io/projected/8eefb88b-418a-4287-9b76-7e4a54d1a461-kube-api-access-hkdm8\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566691 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eefb88b-418a-4287-9b76-7e4a54d1a461-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-ca\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566739 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a186d2c-8f55-4033-9154-b4ff929c9a98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a186d2c-8f55-4033-9154-b4ff929c9a98-images\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.566778 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb2gl\" (UniqueName: \"kubernetes.io/projected/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-kube-api-access-sb2gl\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.567696 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-config\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.568342 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-config\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.568536 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a186d2c-8f55-4033-9154-b4ff929c9a98-config\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.569016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8eefb88b-418a-4287-9b76-7e4a54d1a461-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.570054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-trusted-ca\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.570113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-service-ca\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.570841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-ca\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.571765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.572255 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df271b84-5513-4840-9faa-9b66e4fd3487-serving-cert\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.572611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a186d2c-8f55-4033-9154-b4ff929c9a98-images\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.573736 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df271b84-5513-4840-9faa-9b66e4fd3487-etcd-client\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.573752 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eefb88b-418a-4287-9b76-7e4a54d1a461-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:54 crc kubenswrapper[4735]: 
I1209 15:00:54.574494 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-serving-cert\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.575784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.576462 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.577074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a186d2c-8f55-4033-9154-b4ff929c9a98-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.594096 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8r6dv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.595116 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.596875 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.601447 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8r6dv"] Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.616688 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.643884 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.656177 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.677251 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.696749 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.716769 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.736396 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.756972 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" 
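The long run of reflector.go:368 entries above records the kubelet populating its informer caches with the Secrets and ConfigMaps referenced by the pods it is about to sync. To get a feel for which namespaces and object kinds dominate this phase, the journal text can be tallied with a small script; the following Python sketch is illustrative only (the file name tally_reflector.py is made up) and assumes the raw journal text is piped in on stdin.

#!/usr/bin/env python3
"""Tally the kubelet's 'Caches populated' reflector entries by namespace and kind.

Illustrative sketch, not part of the captured journal. Assumes the raw
journal text is piped in on stdin, e.g.:

    journalctl -u kubelet | python3 tally_reflector.py
"""
import re
import sys
from collections import Counter

# Matches e.g.: Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default"
PATTERN = re.compile(
    r'Caches populated for \*v1\.(?P<kind>\w+) from object-"(?P<ns>[^"]+)"/"(?P<name>[^"]+)"'
)

def main() -> None:
    # Read everything at once: in captures like this one, several log entries
    # share a physical line and some entries wrap across lines.
    text = sys.stdin.read()
    per_namespace = Counter()
    per_kind = Counter()
    for match in PATTERN.finditer(text):
        per_namespace[match.group("ns")] += 1
        per_kind[match.group("kind")] += 1

    for ns, count in per_namespace.most_common():
        print(f"{count:4d}  {ns}")
    print("----")
    for kind, count in per_kind.most_common():
        print(f"{count:4d}  {kind}")

if __name__ == "__main__":
    main()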
Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.777625 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.797648 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.817344 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.837140 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.857023 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.876242 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.896367 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.937424 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.956220 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.977320 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:00:54 crc kubenswrapper[4735]: I1209 15:00:54.997084 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.016738 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.036102 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.056768 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.077957 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.097131 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.116550 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.137646 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 
15:00:55.157356 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.177031 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.197415 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.217038 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.237323 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.256810 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.277207 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.297068 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.317105 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.337185 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.356388 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.376250 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.397436 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.416957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.437038 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.456753 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.475141 4735 request.go:700] Waited for 1.005693569s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.476184 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.496589 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.517013 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.537065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.556630 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.577734 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.596341 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.617083 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.636694 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.656463 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.677149 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.697218 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.716042 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.736780 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.756791 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.776891 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.796663 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.815967 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.836784 4735 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.855985 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.876372 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.897140 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.936690 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.956985 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 15:00:55 crc kubenswrapper[4735]: I1209 15:00:55.976203 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.002091 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.016661 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.036678 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.057204 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.077047 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.096870 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.116290 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.136307 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.157092 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.176629 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.197348 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.217080 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 15:00:56 crc 
kubenswrapper[4735]: I1209 15:00:56.237135 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.256729 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.277051 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.296289 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.316627 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.336845 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.356992 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.391058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb2gl\" (UniqueName: \"kubernetes.io/projected/d75ebf60-0807-4fb2-a38b-ab3dc0e8793a-kube-api-access-sb2gl\") pod \"console-operator-58897d9998-56tt8\" (UID: \"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a\") " pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.408281 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2s6h\" (UniqueName: \"kubernetes.io/projected/0a42c9c7-4bfa-4972-baa1-eb2076d9c5be-kube-api-access-s2s6h\") pod \"openshift-apiserver-operator-796bbdcf4f-zrdkv\" (UID: \"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.428295 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkdm8\" (UniqueName: \"kubernetes.io/projected/8eefb88b-418a-4287-9b76-7e4a54d1a461-kube-api-access-hkdm8\") pod \"openshift-controller-manager-operator-756b6f6bc6-bbnbs\" (UID: \"8eefb88b-418a-4287-9b76-7e4a54d1a461\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.447659 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qs7q\" (UniqueName: \"kubernetes.io/projected/df271b84-5513-4840-9faa-9b66e4fd3487-kube-api-access-6qs7q\") pod \"etcd-operator-b45778765-bxswn\" (UID: \"df271b84-5513-4840-9faa-9b66e4fd3487\") " pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.468015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chwx\" (UniqueName: \"kubernetes.io/projected/3a186d2c-8f55-4033-9154-b4ff929c9a98-kube-api-access-7chwx\") pod \"machine-api-operator-5694c8668f-r4hp5\" (UID: \"3a186d2c-8f55-4033-9154-b4ff929c9a98\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.475749 4735 request.go:700] Waited for 1.880380099s 
due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.476850 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.497078 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.517100 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.539539 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.553202 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da912d07-0a05-4d1c-b042-82d8a3b23467-audit-dir\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-service-ca\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdmf\" (UniqueName: \"kubernetes.io/projected/b835f641-1777-4869-8ae7-161e8f528229-kube-api-access-qrdmf\") pod \"downloads-7954f5f757-r2xnv\" (UID: \"b835f641-1777-4869-8ae7-161e8f528229\") " pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587597 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-tls\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587649 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: 
\"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587673 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-service-ca-bundle\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-etcd-client\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587720 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-oauth-serving-cert\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587739 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwsr\" (UniqueName: \"kubernetes.io/projected/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-kube-api-access-dqwsr\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587760 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8522db03-eed8-439b-a1bd-afe0b724a615-auth-proxy-config\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587783 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53061c50-1a0f-4496-a734-7a8d27f65fe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587809 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsw9l\" (UniqueName: \"kubernetes.io/projected/10df02c0-bbd4-4021-acf6-311c2186ff9e-kube-api-access-fsw9l\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587859 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzghx\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-kube-api-access-jzghx\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: 
\"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8522db03-eed8-439b-a1bd-afe0b724a615-machine-approver-tls\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587903 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf-metrics-tls\") pod \"dns-operator-744455d44c-p77wx\" (UID: \"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf\") " pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587926 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-image-import-ca\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.587986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-encryption-config\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588003 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-audit-dir\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbxs\" (UniqueName: \"kubernetes.io/projected/bfe12755-b370-474e-b856-82522f9b38d0-kube-api-access-6vbxs\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588076 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525bce41-3834-48ac-a687-ce995171d333-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588113 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8cfq\" (UniqueName: \"kubernetes.io/projected/da912d07-0a05-4d1c-b042-82d8a3b23467-kube-api-access-b8cfq\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588136 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-serving-cert\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588176 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-encryption-config\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" 
Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588242 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvpb\" (UniqueName: \"kubernetes.io/projected/e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf-kube-api-access-kzvpb\") pod \"dns-operator-744455d44c-p77wx\" (UID: \"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf\") " pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588304 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588430 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588456 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588479 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7j4x\" (UniqueName: \"kubernetes.io/projected/525bce41-3834-48ac-a687-ce995171d333-kube-api-access-b7j4x\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588532 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588556 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c5317-cf9b-44be-a65d-982bbe6a0473-trusted-ca\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-oauth-config\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 
15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588677 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nfp8\" (UniqueName: \"kubernetes.io/projected/cd154287-dca7-45d0-bd79-af3c4a117793-kube-api-access-5nfp8\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: E1209 15:00:56.588743 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.088726148 +0000 UTC m=+136.013564776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrwj\" (UniqueName: \"kubernetes.io/projected/2ac48d56-9f89-48f7-8840-48d2761beb97-kube-api-access-jzrwj\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cd154287-dca7-45d0-bd79-af3c4a117793-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-trusted-ca\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.588947 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f37c5317-cf9b-44be-a65d-982bbe6a0473-metrics-tls\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 
15:00:56.588982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-trusted-ca-bundle\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589062 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-client-ca\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589082 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-serving-cert\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53061c50-1a0f-4496-a734-7a8d27f65fe6-config\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589125 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-certificates\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-serving-cert\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589191 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-config\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589214 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-serving-cert\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589235 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-audit-policies\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589258 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ac48d56-9f89-48f7-8840-48d2761beb97-serving-cert\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589299 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525bce41-3834-48ac-a687-ce995171d333-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589321 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-etcd-client\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589341 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd154287-dca7-45d0-bd79-af3c4a117793-serving-cert\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589359 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c5317-cf9b-44be-a65d-982bbe6a0473-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-config\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589407 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-config\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589437 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc854\" (UniqueName: \"kubernetes.io/projected/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-kube-api-access-sc854\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589455 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a589e0e-3989-407f-a8c3-5b2391bddc09-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-96lk5\" (UID: \"5a589e0e-3989-407f-a8c3-5b2391bddc09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589477 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8522db03-eed8-439b-a1bd-afe0b724a615-config\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589541 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589563 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/525bce41-3834-48ac-a687-ce995171d333-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589586 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-bound-sa-token\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589605 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-config\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589621 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-audit\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589687 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-client-ca\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589731 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jp4\" (UniqueName: \"kubernetes.io/projected/5a589e0e-3989-407f-a8c3-5b2391bddc09-kube-api-access-f6jp4\") pod \"cluster-samples-operator-665b6dd947-96lk5\" (UID: \"5a589e0e-3989-407f-a8c3-5b2391bddc09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d5fv\" (UniqueName: \"kubernetes.io/projected/f37c5317-cf9b-44be-a65d-982bbe6a0473-kube-api-access-2d5fv\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589774 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-policies\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnpxt\" 
(UniqueName: \"kubernetes.io/projected/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-kube-api-access-fnpxt\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589843 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-serving-cert\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589865 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-console-config\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx4r7\" (UniqueName: \"kubernetes.io/projected/8522db03-eed8-439b-a1bd-afe0b724a615-kube-api-access-wx4r7\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-dir\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/da912d07-0a05-4d1c-b042-82d8a3b23467-node-pullsecrets\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.589993 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.590017 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.590060 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/53061c50-1a0f-4496-a734-7a8d27f65fe6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.627669 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.650049 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.688670 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.691544 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:56 crc kubenswrapper[4735]: E1209 15:00:56.691697 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.191655271 +0000 UTC m=+136.116493899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.691979 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-config\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692028 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/43e42b0e-4eb2-428a-9ee6-733f90aac431-node-bootstrap-token\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692058 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kh5\" (UniqueName: \"kubernetes.io/projected/ffd3099d-e717-438c-a2ce-591b598cd50e-kube-api-access-n2kh5\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692093 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-serving-cert\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-serving-cert\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-audit-policies\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525bce41-3834-48ac-a687-ce995171d333-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-etcd-client\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd154287-dca7-45d0-bd79-af3c4a117793-serving-cert\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692204 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c5317-cf9b-44be-a65d-982bbe6a0473-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f07aeefa-cf94-4c49-ac0a-bec93a1c65f6-cert\") pod \"ingress-canary-6kh8f\" (UID: \"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6\") " pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692250 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp69\" (UniqueName: \"kubernetes.io/projected/f07aeefa-cf94-4c49-ac0a-bec93a1c65f6-kube-api-access-nkp69\") pod \"ingress-canary-6kh8f\" (UID: \"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6\") " pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:56 
crc kubenswrapper[4735]: I1209 15:00:56.692272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45p6k\" (UniqueName: \"kubernetes.io/projected/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-kube-api-access-45p6k\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tcw\" (UniqueName: \"kubernetes.io/projected/e4daca1a-712b-40cb-8943-303ae8542cab-kube-api-access-m9tcw\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/525bce41-3834-48ac-a687-ce995171d333-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692368 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-bound-sa-token\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692402 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-config\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.692496 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-client-ca\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 
crc kubenswrapper[4735]: I1209 15:00:56.693766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jp4\" (UniqueName: \"kubernetes.io/projected/5a589e0e-3989-407f-a8c3-5b2391bddc09-kube-api-access-f6jp4\") pod \"cluster-samples-operator-665b6dd947-96lk5\" (UID: \"5a589e0e-3989-407f-a8c3-5b2391bddc09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.693904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-policies\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.693962 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/43e42b0e-4eb2-428a-9ee6-733f90aac431-certs\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.693999 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-console-config\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.694129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx4r7\" (UniqueName: \"kubernetes.io/projected/8522db03-eed8-439b-a1bd-afe0b724a615-kube-api-access-wx4r7\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.694165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-dir\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.694204 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-plugins-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.694597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-audit-policies\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695153 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-console-config\") pod 
\"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-dir\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695238 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffd3099d-e717-438c-a2ce-591b598cd50e-metrics-tls\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695335 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db303c23-142d-43ce-ba09-59581443cc4e-srv-cert\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695376 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-policies\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695395 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695466 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ec89240-f1c2-444a-9114-5cf83fab7e9d-signing-key\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695574 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695603 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/105fe85b-f861-46f6-b12b-139c5f0a7780-images\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695676 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-tls\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695703 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695727 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-csi-data-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695850 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-etcd-client\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-oauth-serving-cert\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695926 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53061c50-1a0f-4496-a734-7a8d27f65fe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695934 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-config\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.695952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzghx\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-kube-api-access-jzghx\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8522db03-eed8-439b-a1bd-afe0b724a615-machine-approver-tls\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696026 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-config\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696053 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-config\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696574 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-client-ca\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69643ed9-426d-4f75-acf0-2871fd6f6f9e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc 
kubenswrapper[4735]: I1209 15:00:56.696780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-mountpoint-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696809 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-default-certificate\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.696966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.697824 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-encryption-config\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.697883 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-audit-dir\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.697924 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbxs\" (UniqueName: \"kubernetes.io/projected/bfe12755-b370-474e-b856-82522f9b38d0-kube-api-access-6vbxs\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.697983 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: E1209 15:00:56.698703 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.198679168 +0000 UTC m=+136.123517796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.698988 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9348986d-f923-4e2d-86c2-6b9886b736f5-srv-cert\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699028 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/381225e4-030b-401b-a1c6-8926f3a806b7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w8m5z\" (UID: \"381225e4-030b-401b-a1c6-8926f3a806b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66526\" (UniqueName: \"kubernetes.io/projected/db303c23-142d-43ce-ba09-59581443cc4e-kube-api-access-66526\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-audit-dir\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-serving-cert\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699203 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ec89240-f1c2-444a-9114-5cf83fab7e9d-signing-cabundle\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699257 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9348986d-f923-4e2d-86c2-6b9886b736f5-profile-collector-cert\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7j4x\" (UniqueName: \"kubernetes.io/projected/525bce41-3834-48ac-a687-ce995171d333-kube-api-access-b7j4x\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699366 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdtp\" (UniqueName: \"kubernetes.io/projected/3e9eeba2-ec44-48dd-9325-9966670acd75-kube-api-access-fhdtp\") pod \"multus-admission-controller-857f4d67dd-r248q\" (UID: \"3e9eeba2-ec44-48dd-9325-9966670acd75\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699435 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-config\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nfp8\" (UniqueName: \"kubernetes.io/projected/cd154287-dca7-45d0-bd79-af3c4a117793-kube-api-access-5nfp8\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 
15:00:56.699531 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svkn4\" (UniqueName: \"kubernetes.io/projected/105fe85b-f861-46f6-b12b-139c5f0a7780-kube-api-access-svkn4\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db303c23-142d-43ce-ba09-59581443cc4e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699572 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzrwj\" (UniqueName: \"kubernetes.io/projected/2ac48d56-9f89-48f7-8840-48d2761beb97-kube-api-access-jzrwj\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699590 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7nb4\" (UniqueName: \"kubernetes.io/projected/43e42b0e-4eb2-428a-9ee6-733f90aac431-kube-api-access-s7nb4\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f37c5317-cf9b-44be-a65d-982bbe6a0473-metrics-tls\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.699631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-metrics-certs\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.700913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.703579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/525bce41-3834-48ac-a687-ce995171d333-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.703943 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-serving-cert\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.704150 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-encryption-config\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.704451 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-serving-cert\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.714940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.715910 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-oauth-serving-cert\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.717067 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f37c5317-cf9b-44be-a65d-982bbe6a0473-metrics-tls\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.717610 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-serving-cert\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.718058 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8522db03-eed8-439b-a1bd-afe0b724a615-machine-approver-tls\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.718682 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd154287-dca7-45d0-bd79-af3c4a117793-serving-cert\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 
09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.718866 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-tls\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.718933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69643ed9-426d-4f75-acf0-2871fd6f6f9e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.719318 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-etcd-client\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.719320 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.719351 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-etcd-client\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.719546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.719617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c6c87b9-fe9c-45db-bfaa-031a208177db-webhook-cert\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720158 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd3099d-e717-438c-a2ce-591b598cd50e-config-volume\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720231 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/69643ed9-426d-4f75-acf0-2871fd6f6f9e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720264 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53061c50-1a0f-4496-a734-7a8d27f65fe6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720271 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ac48d56-9f89-48f7-8840-48d2761beb97-serving-cert\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720397 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-config\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720604 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnf88\" (UniqueName: \"kubernetes.io/projected/63bb7490-d246-4f4b-9db9-bb254344f4c9-kube-api-access-fnf88\") pod \"migrator-59844c95c7-rzc8z\" (UID: \"63bb7490-d246-4f4b-9db9-bb254344f4c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720656 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67a91130-56ef-4053-b062-ed2dcde04121-secret-volume\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvg7\" (UniqueName: \"kubernetes.io/projected/fb600470-3773-4c44-9069-8b8aa7c18bd6-kube-api-access-sdvg7\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720823 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-config\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.720881 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc854\" (UniqueName: \"kubernetes.io/projected/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-kube-api-access-sc854\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.721060 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a589e0e-3989-407f-a8c3-5b2391bddc09-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-96lk5\" (UID: \"5a589e0e-3989-407f-a8c3-5b2391bddc09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.722165 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-config\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.722339 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8522db03-eed8-439b-a1bd-afe0b724a615-config\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723037 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723144 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fe85b-f861-46f6-b12b-139c5f0a7780-proxy-tls\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8522db03-eed8-439b-a1bd-afe0b724a615-config\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.715187 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: 
\"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723670 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-audit\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723848 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qd2\" (UniqueName: \"kubernetes.io/projected/4c6c87b9-fe9c-45db-bfaa-031a208177db-kube-api-access-75qd2\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723896 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-config\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.723981 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d5fv\" (UniqueName: \"kubernetes.io/projected/f37c5317-cf9b-44be-a65d-982bbe6a0473-kube-api-access-2d5fv\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724116 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-serving-cert\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724259 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-audit\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4c6c87b9-fe9c-45db-bfaa-031a208177db-tmpfs\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fnpxt\" (UniqueName: \"kubernetes.io/projected/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-kube-api-access-fnpxt\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724553 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-serving-cert\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724581 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3e9eeba2-ec44-48dd-9325-9966670acd75-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-r248q\" (UID: \"3e9eeba2-ec44-48dd-9325-9966670acd75\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/da912d07-0a05-4d1c-b042-82d8a3b23467-node-pullsecrets\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-socket-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53061c50-1a0f-4496-a734-7a8d27f65fe6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724802 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-stats-auth\") 
pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724830 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d67w\" (UniqueName: \"kubernetes.io/projected/4da02f65-748e-42ee-82d8-4cd5445d9fab-kube-api-access-2d67w\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724857 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da912d07-0a05-4d1c-b042-82d8a3b23467-audit-dir\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-service-ca\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724914 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-registration-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724944 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdmf\" (UniqueName: \"kubernetes.io/projected/b835f641-1777-4869-8ae7-161e8f528229-kube-api-access-qrdmf\") pod \"downloads-7954f5f757-r2xnv\" (UID: \"b835f641-1777-4869-8ae7-161e8f528229\") " pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.724979 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/105fe85b-f861-46f6-b12b-139c5f0a7780-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725062 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-service-ca-bundle\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb600470-3773-4c44-9069-8b8aa7c18bd6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 
15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwsr\" (UniqueName: \"kubernetes.io/projected/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-kube-api-access-dqwsr\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725202 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r4hp5"] Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8522db03-eed8-439b-a1bd-afe0b724a615-auth-proxy-config\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsw9l\" (UniqueName: \"kubernetes.io/projected/10df02c0-bbd4-4021-acf6-311c2186ff9e-kube-api-access-fsw9l\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725352 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgchg\" (UniqueName: \"kubernetes.io/projected/381225e4-030b-401b-a1c6-8926f3a806b7-kube-api-access-tgchg\") pod \"control-plane-machine-set-operator-78cbb6b69f-w8m5z\" (UID: \"381225e4-030b-401b-a1c6-8926f3a806b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725443 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf-metrics-tls\") pod \"dns-operator-744455d44c-p77wx\" (UID: \"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf\") " pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725487 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-image-import-ca\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725623 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85cr\" (UniqueName: \"kubernetes.io/projected/fc05d12f-eaa9-48ae-b280-e449caed078c-kube-api-access-b85cr\") pod \"marketplace-operator-79b997595-rxhld\" (UID: 
\"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725672 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725693 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4da02f65-748e-42ee-82d8-4cd5445d9fab-service-ca-bundle\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb600470-3773-4c44-9069-8b8aa7c18bd6-proxy-tls\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525bce41-3834-48ac-a687-ce995171d333-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725781 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8522db03-eed8-439b-a1bd-afe0b724a615-auth-proxy-config\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725805 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8cfq\" (UniqueName: \"kubernetes.io/projected/da912d07-0a05-4d1c-b042-82d8a3b23467-kube-api-access-b8cfq\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725846 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c6c87b9-fe9c-45db-bfaa-031a208177db-apiservice-cert\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725878 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4daca1a-712b-40cb-8943-303ae8542cab-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725905 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gql\" (UniqueName: \"kubernetes.io/projected/67a91130-56ef-4053-b062-ed2dcde04121-kube-api-access-45gql\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.725935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726004 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crq62\" (UniqueName: \"kubernetes.io/projected/45325644-f6e0-48dd-ac6f-05d02cbee704-kube-api-access-crq62\") pod \"package-server-manager-789f6589d5-vsnk7\" (UID: \"45325644-f6e0-48dd-ac6f-05d02cbee704\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-encryption-config\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726104 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvpb\" (UniqueName: \"kubernetes.io/projected/e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf-kube-api-access-kzvpb\") pod \"dns-operator-744455d44c-p77wx\" (UID: \"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf\") " pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726181 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c5317-cf9b-44be-a65d-982bbe6a0473-trusted-ca\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc 
kubenswrapper[4735]: I1209 15:00:56.726202 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/45325644-f6e0-48dd-ac6f-05d02cbee704-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vsnk7\" (UID: \"45325644-f6e0-48dd-ac6f-05d02cbee704\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfgqw\" (UniqueName: \"kubernetes.io/projected/9348986d-f923-4e2d-86c2-6b9886b736f5-kube-api-access-cfgqw\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-oauth-config\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726325 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cd154287-dca7-45d0-bd79-af3c4a117793-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726464 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-trusted-ca\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4daca1a-712b-40cb-8943-303ae8542cab-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67a91130-56ef-4053-b062-ed2dcde04121-config-volume\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc 
kubenswrapper[4735]: I1209 15:00:56.726572 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x2x6\" (UniqueName: \"kubernetes.io/projected/2be1ce89-c4a7-4237-b0f3-221ac11f813a-kube-api-access-5x2x6\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726608 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-trusted-ca-bundle\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726639 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-client-ca\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726716 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-serving-cert\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726740 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726771 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhcm\" (UniqueName: \"kubernetes.io/projected/1ec89240-f1c2-444a-9114-5cf83fab7e9d-kube-api-access-dlhcm\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-certificates\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.726867 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/53061c50-1a0f-4496-a734-7a8d27f65fe6-config\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.727671 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53061c50-1a0f-4496-a734-7a8d27f65fe6-config\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.731153 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.731397 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.731833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da912d07-0a05-4d1c-b042-82d8a3b23467-audit-dir\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.732043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c5317-cf9b-44be-a65d-982bbe6a0473-trusted-ca\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.732907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-service-ca\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.732980 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/da912d07-0a05-4d1c-b042-82d8a3b23467-node-pullsecrets\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.733475 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc 
kubenswrapper[4735]: I1209 15:00:56.734008 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cd154287-dca7-45d0-bd79-af3c4a117793-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.734236 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.734654 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.734943 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c5317-cf9b-44be-a65d-982bbe6a0473-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.735201 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-certificates\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.735664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-trusted-ca-bundle\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.735716 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-trusted-ca\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.736013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/da912d07-0a05-4d1c-b042-82d8a3b23467-image-import-ca\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.736024 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/525bce41-3834-48ac-a687-ce995171d333-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: 
\"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.736319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.736330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-service-ca-bundle\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.736500 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-client-ca\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.738868 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.739018 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.739546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.740470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-serving-cert\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.740920 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da912d07-0a05-4d1c-b042-82d8a3b23467-encryption-config\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.740944 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-oauth-config\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.741013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf-metrics-tls\") pod \"dns-operator-744455d44c-p77wx\" (UID: \"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf\") " pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.741026 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.741185 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ac48d56-9f89-48f7-8840-48d2761beb97-serving-cert\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.741992 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-serving-cert\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.742384 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a589e0e-3989-407f-a8c3-5b2391bddc09-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-96lk5\" (UID: \"5a589e0e-3989-407f-a8c3-5b2391bddc09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.748405 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv"] Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.769644 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-bound-sa-token\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.771942 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/525bce41-3834-48ac-a687-ce995171d333-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.801274 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wx4r7\" (UniqueName: \"kubernetes.io/projected/8522db03-eed8-439b-a1bd-afe0b724a615-kube-api-access-wx4r7\") pod \"machine-approver-56656f9798-hwpt4\" (UID: \"8522db03-eed8-439b-a1bd-afe0b724a615\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.816823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jp4\" (UniqueName: \"kubernetes.io/projected/5a589e0e-3989-407f-a8c3-5b2391bddc09-kube-api-access-f6jp4\") pod \"cluster-samples-operator-665b6dd947-96lk5\" (UID: \"5a589e0e-3989-407f-a8c3-5b2391bddc09\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829085 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829425 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b85cr\" (UniqueName: \"kubernetes.io/projected/fc05d12f-eaa9-48ae-b280-e449caed078c-kube-api-access-b85cr\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829481 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4da02f65-748e-42ee-82d8-4cd5445d9fab-service-ca-bundle\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829504 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb600470-3773-4c44-9069-8b8aa7c18bd6-proxy-tls\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829545 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c6c87b9-fe9c-45db-bfaa-031a208177db-apiservice-cert\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829591 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gql\" (UniqueName: \"kubernetes.io/projected/67a91130-56ef-4053-b062-ed2dcde04121-kube-api-access-45gql\") pod 
\"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829614 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4daca1a-712b-40cb-8943-303ae8542cab-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crq62\" (UniqueName: \"kubernetes.io/projected/45325644-f6e0-48dd-ac6f-05d02cbee704-kube-api-access-crq62\") pod \"package-server-manager-789f6589d5-vsnk7\" (UID: \"45325644-f6e0-48dd-ac6f-05d02cbee704\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/45325644-f6e0-48dd-ac6f-05d02cbee704-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vsnk7\" (UID: \"45325644-f6e0-48dd-ac6f-05d02cbee704\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfgqw\" (UniqueName: \"kubernetes.io/projected/9348986d-f923-4e2d-86c2-6b9886b736f5-kube-api-access-cfgqw\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4daca1a-712b-40cb-8943-303ae8542cab-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67a91130-56ef-4053-b062-ed2dcde04121-config-volume\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x2x6\" (UniqueName: \"kubernetes.io/projected/2be1ce89-c4a7-4237-b0f3-221ac11f813a-kube-api-access-5x2x6\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829791 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829813 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhcm\" (UniqueName: \"kubernetes.io/projected/1ec89240-f1c2-444a-9114-5cf83fab7e9d-kube-api-access-dlhcm\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829833 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/43e42b0e-4eb2-428a-9ee6-733f90aac431-node-bootstrap-token\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829855 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2kh5\" (UniqueName: \"kubernetes.io/projected/ffd3099d-e717-438c-a2ce-591b598cd50e-kube-api-access-n2kh5\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829883 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f07aeefa-cf94-4c49-ac0a-bec93a1c65f6-cert\") pod \"ingress-canary-6kh8f\" (UID: \"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6\") " pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829904 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp69\" (UniqueName: \"kubernetes.io/projected/f07aeefa-cf94-4c49-ac0a-bec93a1c65f6-kube-api-access-nkp69\") pod \"ingress-canary-6kh8f\" (UID: \"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6\") " pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45p6k\" (UniqueName: \"kubernetes.io/projected/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-kube-api-access-45p6k\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tcw\" (UniqueName: \"kubernetes.io/projected/e4daca1a-712b-40cb-8943-303ae8542cab-kube-api-access-m9tcw\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.829967 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/43e42b0e-4eb2-428a-9ee6-733f90aac431-certs\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 
15:00:56.829986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-plugins-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830004 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffd3099d-e717-438c-a2ce-591b598cd50e-metrics-tls\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830022 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830042 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db303c23-142d-43ce-ba09-59581443cc4e-srv-cert\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830062 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ec89240-f1c2-444a-9114-5cf83fab7e9d-signing-key\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/105fe85b-f861-46f6-b12b-139c5f0a7780-images\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830117 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-csi-data-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830149 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-config\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830175 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69643ed9-426d-4f75-acf0-2871fd6f6f9e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-mountpoint-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-default-certificate\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9348986d-f923-4e2d-86c2-6b9886b736f5-srv-cert\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830287 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/381225e4-030b-401b-a1c6-8926f3a806b7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w8m5z\" (UID: \"381225e4-030b-401b-a1c6-8926f3a806b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66526\" (UniqueName: \"kubernetes.io/projected/db303c23-142d-43ce-ba09-59581443cc4e-kube-api-access-66526\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830328 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9348986d-f923-4e2d-86c2-6b9886b736f5-profile-collector-cert\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830357 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdtp\" (UniqueName: \"kubernetes.io/projected/3e9eeba2-ec44-48dd-9325-9966670acd75-kube-api-access-fhdtp\") pod \"multus-admission-controller-857f4d67dd-r248q\" (UID: \"3e9eeba2-ec44-48dd-9325-9966670acd75\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830374 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ec89240-f1c2-444a-9114-5cf83fab7e9d-signing-cabundle\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: 
I1209 15:00:56.830963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-config\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.830992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svkn4\" (UniqueName: \"kubernetes.io/projected/105fe85b-f861-46f6-b12b-139c5f0a7780-kube-api-access-svkn4\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831013 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db303c23-142d-43ce-ba09-59581443cc4e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7nb4\" (UniqueName: \"kubernetes.io/projected/43e42b0e-4eb2-428a-9ee6-733f90aac431-kube-api-access-s7nb4\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831062 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-metrics-certs\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831091 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69643ed9-426d-4f75-acf0-2871fd6f6f9e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c6c87b9-fe9c-45db-bfaa-031a208177db-webhook-cert\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831131 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd3099d-e717-438c-a2ce-591b598cd50e-config-volume\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/69643ed9-426d-4f75-acf0-2871fd6f6f9e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831170 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnf88\" (UniqueName: \"kubernetes.io/projected/63bb7490-d246-4f4b-9db9-bb254344f4c9-kube-api-access-fnf88\") pod \"migrator-59844c95c7-rzc8z\" (UID: \"63bb7490-d246-4f4b-9db9-bb254344f4c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831190 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67a91130-56ef-4053-b062-ed2dcde04121-secret-volume\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvg7\" (UniqueName: \"kubernetes.io/projected/fb600470-3773-4c44-9069-8b8aa7c18bd6-kube-api-access-sdvg7\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831253 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fe85b-f861-46f6-b12b-139c5f0a7780-proxy-tls\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qd2\" (UniqueName: \"kubernetes.io/projected/4c6c87b9-fe9c-45db-bfaa-031a208177db-kube-api-access-75qd2\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831297 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-serving-cert\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831324 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/3e9eeba2-ec44-48dd-9325-9966670acd75-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-r248q\" (UID: \"3e9eeba2-ec44-48dd-9325-9966670acd75\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831345 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4c6c87b9-fe9c-45db-bfaa-031a208177db-tmpfs\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-socket-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-stats-auth\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d67w\" (UniqueName: \"kubernetes.io/projected/4da02f65-748e-42ee-82d8-4cd5445d9fab-kube-api-access-2d67w\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-registration-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831475 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/105fe85b-f861-46f6-b12b-139c5f0a7780-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831548 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgchg\" (UniqueName: \"kubernetes.io/projected/381225e4-030b-401b-a1c6-8926f3a806b7-kube-api-access-tgchg\") pod \"control-plane-machine-set-operator-78cbb6b69f-w8m5z\" (UID: \"381225e4-030b-401b-a1c6-8926f3a806b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831571 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb600470-3773-4c44-9069-8b8aa7c18bd6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.831589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-csi-data-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: E1209 15:00:56.831732 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.331703475 +0000 UTC m=+136.256542103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.832316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzghx\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-kube-api-access-jzghx\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.832346 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-bxswn"] Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.833370 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4da02f65-748e-42ee-82d8-4cd5445d9fab-service-ca-bundle\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.834156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-config\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.834270 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-registration-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.834376 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/105fe85b-f861-46f6-b12b-139c5f0a7780-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.835003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ec89240-f1c2-444a-9114-5cf83fab7e9d-signing-cabundle\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.835057 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.835498 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-config\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.835978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd3099d-e717-438c-a2ce-591b598cd50e-config-volume\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.836085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-socket-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.836939 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4c6c87b9-fe9c-45db-bfaa-031a208177db-tmpfs\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.837270 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69643ed9-426d-4f75-acf0-2871fd6f6f9e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.837646 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb600470-3773-4c44-9069-8b8aa7c18bd6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.837864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-mountpoint-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.838223 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c6c87b9-fe9c-45db-bfaa-031a208177db-webhook-cert\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.838260 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-serving-cert\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.840313 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3e9eeba2-ec44-48dd-9325-9966670acd75-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-r248q\" (UID: \"3e9eeba2-ec44-48dd-9325-9966670acd75\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.841273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/105fe85b-f861-46f6-b12b-139c5f0a7780-images\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.841563 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2be1ce89-c4a7-4237-b0f3-221ac11f813a-plugins-dir\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.842568 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67a91130-56ef-4053-b062-ed2dcde04121-config-volume\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.843409 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67a91130-56ef-4053-b062-ed2dcde04121-secret-volume\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.844834 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9348986d-f923-4e2d-86c2-6b9886b736f5-profile-collector-cert\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.845766 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb600470-3773-4c44-9069-8b8aa7c18bd6-proxy-tls\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.846047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4daca1a-712b-40cb-8943-303ae8542cab-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.846347 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-stats-auth\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.846585 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/43e42b0e-4eb2-428a-9ee6-733f90aac431-node-bootstrap-token\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.846779 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-metrics-certs\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.847731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db303c23-142d-43ce-ba09-59581443cc4e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4daca1a-712b-40cb-8943-303ae8542cab-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fe85b-f861-46f6-b12b-139c5f0a7780-proxy-tls\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69643ed9-426d-4f75-acf0-2871fd6f6f9e-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db303c23-142d-43ce-ba09-59581443cc4e-srv-cert\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848650 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f07aeefa-cf94-4c49-ac0a-bec93a1c65f6-cert\") pod \"ingress-canary-6kh8f\" (UID: \"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6\") " pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.848856 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4da02f65-748e-42ee-82d8-4cd5445d9fab-default-certificate\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.849030 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffd3099d-e717-438c-a2ce-591b598cd50e-metrics-tls\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.849459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbxs\" (UniqueName: \"kubernetes.io/projected/bfe12755-b370-474e-b856-82522f9b38d0-kube-api-access-6vbxs\") pod \"console-f9d7485db-c4mlr\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.850091 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c6c87b9-fe9c-45db-bfaa-031a208177db-apiservice-cert\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.852133 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/43e42b0e-4eb2-428a-9ee6-733f90aac431-certs\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.854225 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/381225e4-030b-401b-a1c6-8926f3a806b7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w8m5z\" (UID: \"381225e4-030b-401b-a1c6-8926f3a806b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.854224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/45325644-f6e0-48dd-ac6f-05d02cbee704-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vsnk7\" (UID: \"45325644-f6e0-48dd-ac6f-05d02cbee704\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.854816 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ec89240-f1c2-444a-9114-5cf83fab7e9d-signing-key\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.856044 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.859555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9348986d-f923-4e2d-86c2-6b9886b736f5-srv-cert\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.874830 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-56tt8"] Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.896467 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.898085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7j4x\" (UniqueName: \"kubernetes.io/projected/525bce41-3834-48ac-a687-ce995171d333-kube-api-access-b7j4x\") pod \"cluster-image-registry-operator-dc59b4c8b-q2cst\" (UID: \"525bce41-3834-48ac-a687-ce995171d333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.902062 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.914740 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nfp8\" (UniqueName: \"kubernetes.io/projected/cd154287-dca7-45d0-bd79-af3c4a117793-kube-api-access-5nfp8\") pod \"openshift-config-operator-7777fb866f-mr9zt\" (UID: \"cd154287-dca7-45d0-bd79-af3c4a117793\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.916107 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs"] Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.921315 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:56 crc kubenswrapper[4735]: W1209 15:00:56.925858 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eefb88b_418a_4287_9b76_7e4a54d1a461.slice/crio-21dec7e08cede502de85198d1b04de7f26fddff0d9c48bc0458317458fc14273 WatchSource:0}: Error finding container 21dec7e08cede502de85198d1b04de7f26fddff0d9c48bc0458317458fc14273: Status 404 returned error can't find the container with id 21dec7e08cede502de85198d1b04de7f26fddff0d9c48bc0458317458fc14273 Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.931613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrwj\" (UniqueName: \"kubernetes.io/projected/2ac48d56-9f89-48f7-8840-48d2761beb97-kube-api-access-jzrwj\") pod \"route-controller-manager-6576b87f9c-sqm8s\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.932789 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:56 crc kubenswrapper[4735]: E1209 15:00:56.933109 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.43309252 +0000 UTC m=+136.357931148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.939759 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.952408 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc854\" (UniqueName: \"kubernetes.io/projected/3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf-kube-api-access-sc854\") pod \"apiserver-7bbb656c7d-cgl2q\" (UID: \"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.954649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" event={"ID":"df271b84-5513-4840-9faa-9b66e4fd3487","Type":"ContainerStarted","Data":"ea2ec22c4409b0781808040f0e1b1f98b1e1bd4473220f40a65f392eb7f8b044"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.957057 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-56tt8" event={"ID":"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a","Type":"ContainerStarted","Data":"42051eeafb1297250a9ea5d21d3b4cd2481c258ea2fc3b04233507f4662a6193"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.972660 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d5fv\" (UniqueName: \"kubernetes.io/projected/f37c5317-cf9b-44be-a65d-982bbe6a0473-kube-api-access-2d5fv\") pod \"ingress-operator-5b745b69d9-x6b7z\" (UID: \"f37c5317-cf9b-44be-a65d-982bbe6a0473\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.973499 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" event={"ID":"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be","Type":"ContainerStarted","Data":"aeccb3e8f1fa7c03590e933ef045653a4fb7e5a20c4d5570ad897448d23d6810"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.973554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" event={"ID":"0a42c9c7-4bfa-4972-baa1-eb2076d9c5be","Type":"ContainerStarted","Data":"8c60f29ea347f9cba5a1ea3c6f2e6136677bcb28113fe7ccb34190ea9790c6c6"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.989617 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" event={"ID":"8eefb88b-418a-4287-9b76-7e4a54d1a461","Type":"ContainerStarted","Data":"21dec7e08cede502de85198d1b04de7f26fddff0d9c48bc0458317458fc14273"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.994216 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" event={"ID":"3a186d2c-8f55-4033-9154-b4ff929c9a98","Type":"ContainerStarted","Data":"13ae914f3be31737420dd216fa9052d6ef3f4d39c11e382f784643b86da2eb31"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.994270 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" event={"ID":"3a186d2c-8f55-4033-9154-b4ff929c9a98","Type":"ContainerStarted","Data":"cef320f6a43d9674936770b78d1e3ad6090a504a42809371c2d164a80b7d1ba3"} Dec 09 15:00:56 crc kubenswrapper[4735]: I1209 15:00:56.995131 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/53061c50-1a0f-4496-a734-7a8d27f65fe6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rkptn\" (UID: \"53061c50-1a0f-4496-a734-7a8d27f65fe6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.006591 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.016670 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnpxt\" (UniqueName: \"kubernetes.io/projected/4dee35e0-d22b-4610-90a0-9dd076a8b6a5-kube-api-access-fnpxt\") pod \"authentication-operator-69f744f599-jgd5m\" (UID: \"4dee35e0-d22b-4610-90a0-9dd076a8b6a5\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.021271 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.028917 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.034048 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.034665 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.534644835 +0000 UTC m=+136.459483463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.035014 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.035533 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.535525086 +0000 UTC m=+136.460363714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.037355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsw9l\" (UniqueName: \"kubernetes.io/projected/10df02c0-bbd4-4021-acf6-311c2186ff9e-kube-api-access-fsw9l\") pod \"oauth-openshift-558db77b4-mxscb\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.053960 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdmf\" (UniqueName: \"kubernetes.io/projected/b835f641-1777-4869-8ae7-161e8f528229-kube-api-access-qrdmf\") pod \"downloads-7954f5f757-r2xnv\" (UID: \"b835f641-1777-4869-8ae7-161e8f528229\") " pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.060462 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8522db03_eed8_439b_a1bd_afe0b724a615.slice/crio-29ce315f33fd762d990d48c15963e243a984bd67cb6f5a76ea63483d7d1660f2 WatchSource:0}: Error finding container 29ce315f33fd762d990d48c15963e243a984bd67cb6f5a76ea63483d7d1660f2: Status 404 returned error can't find the container with id 29ce315f33fd762d990d48c15963e243a984bd67cb6f5a76ea63483d7d1660f2 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.079110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwsr\" (UniqueName: \"kubernetes.io/projected/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-kube-api-access-dqwsr\") pod \"controller-manager-879f6c89f-fpbzm\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.091790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvpb\" (UniqueName: \"kubernetes.io/projected/e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf-kube-api-access-kzvpb\") pod \"dns-operator-744455d44c-p77wx\" (UID: \"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf\") " pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.105019 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.106807 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-c4mlr"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.117605 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.127070 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8cfq\" (UniqueName: \"kubernetes.io/projected/da912d07-0a05-4d1c-b042-82d8a3b23467-kube-api-access-b8cfq\") pod \"apiserver-76f77b778f-t5fvv\" (UID: \"da912d07-0a05-4d1c-b042-82d8a3b23467\") " pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.134392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhcm\" (UniqueName: \"kubernetes.io/projected/1ec89240-f1c2-444a-9114-5cf83fab7e9d-kube-api-access-dlhcm\") pod \"service-ca-9c57cc56f-8vprh\" (UID: \"1ec89240-f1c2-444a-9114-5cf83fab7e9d\") " pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.136133 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.136588 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.636568461 +0000 UTC m=+136.561407088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.136719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.137453 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.637439573 +0000 UTC m=+136.562278201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.138598 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe12755_b370_474e_b856_82522f9b38d0.slice/crio-9818553b5a097eac8b42f22ae22b0f4b0a6be16a4abec3dcf7163e3668061a99 WatchSource:0}: Error finding container 9818553b5a097eac8b42f22ae22b0f4b0a6be16a4abec3dcf7163e3668061a99: Status 404 returned error can't find the container with id 9818553b5a097eac8b42f22ae22b0f4b0a6be16a4abec3dcf7163e3668061a99 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.155829 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85cr\" (UniqueName: \"kubernetes.io/projected/fc05d12f-eaa9-48ae-b280-e449caed078c-kube-api-access-b85cr\") pod \"marketplace-operator-79b997595-rxhld\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.167965 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.170390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svkn4\" (UniqueName: \"kubernetes.io/projected/105fe85b-f861-46f6-b12b-139c5f0a7780-kube-api-access-svkn4\") pod \"machine-config-operator-74547568cd-ctqxr\" (UID: \"105fe85b-f861-46f6-b12b-139c5f0a7780\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.175067 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.182155 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd154287_dca7_45d0_bd79_af3c4a117793.slice/crio-6b91cf2bb0ce7542f4b68bae18b33f1ba168ab0924afa3880a08b7de783809d2 WatchSource:0}: Error finding container 6b91cf2bb0ce7542f4b68bae18b33f1ba168ab0924afa3880a08b7de783809d2: Status 404 returned error can't find the container with id 6b91cf2bb0ce7542f4b68bae18b33f1ba168ab0924afa3880a08b7de783809d2 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.184440 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.190904 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdtp\" (UniqueName: \"kubernetes.io/projected/3e9eeba2-ec44-48dd-9325-9966670acd75-kube-api-access-fhdtp\") pod \"multus-admission-controller-857f4d67dd-r248q\" (UID: \"3e9eeba2-ec44-48dd-9325-9966670acd75\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.210838 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.213481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvg7\" (UniqueName: \"kubernetes.io/projected/fb600470-3773-4c44-9069-8b8aa7c18bd6-kube-api-access-sdvg7\") pod \"machine-config-controller-84d6567774-6hv2p\" (UID: \"fb600470-3773-4c44-9069-8b8aa7c18bd6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.233458 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.236315 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd66c484-656a-4a2e-a3e9-2b7a32ba2def-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wtqjc\" (UID: \"fd66c484-656a-4a2e-a3e9-2b7a32ba2def\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.237841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.238577 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.738562188 +0000 UTC m=+136.663400817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.251720 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.257963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d67w\" (UniqueName: \"kubernetes.io/projected/4da02f65-748e-42ee-82d8-4cd5445d9fab-kube-api-access-2d67w\") pod \"router-default-5444994796-n8ljm\" (UID: \"4da02f65-748e-42ee-82d8-4cd5445d9fab\") " pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.258113 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.267754 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.272049 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgchg\" (UniqueName: \"kubernetes.io/projected/381225e4-030b-401b-a1c6-8926f3a806b7-kube-api-access-tgchg\") pod \"control-plane-machine-set-operator-78cbb6b69f-w8m5z\" (UID: \"381225e4-030b-401b-a1c6-8926f3a806b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.272903 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn"] Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.290188 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac48d56_9f89_48f7_8840_48d2761beb97.slice/crio-3d4f73c95edd9049f96b942b6f2c905437e519a397c55f691ad84d09113c3160 WatchSource:0}: Error finding container 3d4f73c95edd9049f96b942b6f2c905437e519a397c55f691ad84d09113c3160: Status 404 returned error can't find the container with id 3d4f73c95edd9049f96b942b6f2c905437e519a397c55f691ad84d09113c3160 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.307550 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7nb4\" (UniqueName: \"kubernetes.io/projected/43e42b0e-4eb2-428a-9ee6-733f90aac431-kube-api-access-s7nb4\") pod \"machine-config-server-2hgjx\" (UID: \"43e42b0e-4eb2-428a-9ee6-733f90aac431\") " pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.309070 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qd2\" (UniqueName: \"kubernetes.io/projected/4c6c87b9-fe9c-45db-bfaa-031a208177db-kube-api-access-75qd2\") pod \"packageserver-d55dfcdfc-tbgl7\" (UID: \"4c6c87b9-fe9c-45db-bfaa-031a208177db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 
15:00:57.309202 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.319568 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53061c50_1a0f_4496_a734_7a8d27f65fe6.slice/crio-740258dd76fdf55af060ad68f6dfcbaf85c996a98f39656e3e252600049fe619 WatchSource:0}: Error finding container 740258dd76fdf55af060ad68f6dfcbaf85c996a98f39656e3e252600049fe619: Status 404 returned error can't find the container with id 740258dd76fdf55af060ad68f6dfcbaf85c996a98f39656e3e252600049fe619 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.329073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9tcw\" (UniqueName: \"kubernetes.io/projected/e4daca1a-712b-40cb-8943-303ae8542cab-kube-api-access-m9tcw\") pod \"kube-storage-version-migrator-operator-b67b599dd-pcdrs\" (UID: \"e4daca1a-712b-40cb-8943-303ae8542cab\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.340844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.341167 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.341714 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.841692095 +0000 UTC m=+136.766530723 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.365708 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.365878 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.370617 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69643ed9-426d-4f75-acf0-2871fd6f6f9e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5xt67\" (UID: \"69643ed9-426d-4f75-acf0-2871fd6f6f9e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.370700 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.372257 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp69\" (UniqueName: \"kubernetes.io/projected/f07aeefa-cf94-4c49-ac0a-bec93a1c65f6-kube-api-access-nkp69\") pod \"ingress-canary-6kh8f\" (UID: \"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6\") " pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.382234 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4da02f65_748e_42ee_82d8_4cd5445d9fab.slice/crio-44411d5792bfbc9234f9136e91216d08aab6ab5109fd1f469997eb4d09afb270 WatchSource:0}: Error finding container 44411d5792bfbc9234f9136e91216d08aab6ab5109fd1f469997eb4d09afb270: Status 404 returned error can't find the container with id 44411d5792bfbc9234f9136e91216d08aab6ab5109fd1f469997eb4d09afb270 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.389707 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.392478 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jgd5m"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.396808 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.397616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kh5\" (UniqueName: \"kubernetes.io/projected/ffd3099d-e717-438c-a2ce-591b598cd50e-kube-api-access-n2kh5\") pod \"dns-default-8r6dv\" (UID: \"ffd3099d-e717-438c-a2ce-591b598cd50e\") " pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.409643 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.410566 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.411214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnf88\" (UniqueName: \"kubernetes.io/projected/63bb7490-d246-4f4b-9db9-bb254344f4c9-kube-api-access-fnf88\") pod \"migrator-59844c95c7-rzc8z\" (UID: \"63bb7490-d246-4f4b-9db9-bb254344f4c9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.423045 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.425106 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.431214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.438823 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x2x6\" (UniqueName: \"kubernetes.io/projected/2be1ce89-c4a7-4237-b0f3-221ac11f813a-kube-api-access-5x2x6\") pod \"csi-hostpathplugin-ms5pn\" (UID: \"2be1ce89-c4a7-4237-b0f3-221ac11f813a\") " pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.442772 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2hgjx" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.445632 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fpbzm"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.445702 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.446630 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6kh8f" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.446749 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.447193 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:57.947176096 +0000 UTC m=+136.872014724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.475775 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.475893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8r6dv" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.484868 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45p6k\" (UniqueName: \"kubernetes.io/projected/32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd-kube-api-access-45p6k\") pod \"service-ca-operator-777779d784-5rzqx\" (UID: \"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.486839 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gql\" (UniqueName: \"kubernetes.io/projected/67a91130-56ef-4053-b062-ed2dcde04121-kube-api-access-45gql\") pod \"collect-profiles-29421540-x8zxv\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.492569 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crq62\" (UniqueName: \"kubernetes.io/projected/45325644-f6e0-48dd-ac6f-05d02cbee704-kube-api-access-crq62\") pod \"package-server-manager-789f6589d5-vsnk7\" (UID: \"45325644-f6e0-48dd-ac6f-05d02cbee704\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.502916 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dee35e0_d22b_4610_90a0_9dd076a8b6a5.slice/crio-bc80f195845402711e27c645b482c7b730b789284987be5c31ed729fb2736592 WatchSource:0}: Error finding container bc80f195845402711e27c645b482c7b730b789284987be5c31ed729fb2736592: Status 404 returned error can't find the container with id bc80f195845402711e27c645b482c7b730b789284987be5c31ed729fb2736592 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.537614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfgqw\" (UniqueName: \"kubernetes.io/projected/9348986d-f923-4e2d-86c2-6b9886b736f5-kube-api-access-cfgqw\") pod \"catalog-operator-68c6474976-xxlf7\" (UID: \"9348986d-f923-4e2d-86c2-6b9886b736f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.548110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 
15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.549219 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.049207877 +0000 UTC m=+136.974046505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.552298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66526\" (UniqueName: \"kubernetes.io/projected/db303c23-142d-43ce-ba09-59581443cc4e-kube-api-access-66526\") pod \"olm-operator-6b444d44fb-qwjzw\" (UID: \"db303c23-142d-43ce-ba09-59581443cc4e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.615071 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.630649 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxscb"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.634479 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.648052 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.648705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.648995 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.148975235 +0000 UTC m=+137.073813864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.649090 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.649426 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.149413532 +0000 UTC m=+137.074252160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.652382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.680291 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.680772 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.682193 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.687626 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf37c5317_cf9b_44be_a65d_982bbe6a0473.slice/crio-7c57c836f32e79893b54505b74854594ddd15a856d7faec187ea55b9c8901732 WatchSource:0}: Error finding container 7c57c836f32e79893b54505b74854594ddd15a856d7faec187ea55b9c8901732: Status 404 returned error can't find the container with id 7c57c836f32e79893b54505b74854594ddd15a856d7faec187ea55b9c8901732 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.688773 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.733959 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.750075 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.750537 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.250508885 +0000 UTC m=+137.175347504 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.763350 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr"] Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.788711 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10df02c0_bbd4_4021_acf6_311c2186ff9e.slice/crio-af1539b531f871d2326ee3e01fffa5bbae2807aee438404a33d7da3d5f703da9 WatchSource:0}: Error finding container af1539b531f871d2326ee3e01fffa5bbae2807aee438404a33d7da3d5f703da9: Status 404 returned error can't find the container with id af1539b531f871d2326ee3e01fffa5bbae2807aee438404a33d7da3d5f703da9 Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.851316 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:57 crc kubenswrapper[4735]: W1209 15:00:57.851672 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod105fe85b_f861_46f6_b12b_139c5f0a7780.slice/crio-32001eb1a39841dbf7fb6c9eac3676aeba61735c98774ca73f1eae7c224df584 WatchSource:0}: Error finding container 32001eb1a39841dbf7fb6c9eac3676aeba61735c98774ca73f1eae7c224df584: Status 404 returned error can't find the container with id 32001eb1a39841dbf7fb6c9eac3676aeba61735c98774ca73f1eae7c224df584 Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.852635 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.352620018 +0000 UTC m=+137.277458646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.920016 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-r2xnv"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.931638 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p77wx"] Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.952447 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.952875 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.452820343 +0000 UTC m=+137.377658971 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:57 crc kubenswrapper[4735]: I1209 15:00:57.953050 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:57 crc kubenswrapper[4735]: E1209 15:00:57.953672 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.453665076 +0000 UTC m=+137.378503703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.008956 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-56tt8" event={"ID":"d75ebf60-0807-4fb2-a38b-ab3dc0e8793a","Type":"ContainerStarted","Data":"f43a8e8d6bba030cd2065da5551f0c2707e3034fafaf1772087ee66dc2fc414c"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.009381 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.011597 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" event={"ID":"2ac48d56-9f89-48f7-8840-48d2761beb97","Type":"ContainerStarted","Data":"f39fdf014facbf2dc8816e1be6d0b7670172a884b0b8349e58e50f8f19b9d19a"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.011629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" event={"ID":"2ac48d56-9f89-48f7-8840-48d2761beb97","Type":"ContainerStarted","Data":"3d4f73c95edd9049f96b942b6f2c905437e519a397c55f691ad84d09113c3160"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.019044 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.034812 4735 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-sqm8s container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.034853 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" podUID="2ac48d56-9f89-48f7-8840-48d2761beb97" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.045479 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2hgjx" event={"ID":"43e42b0e-4eb2-428a-9ee6-733f90aac431","Type":"ContainerStarted","Data":"12d7715e0bce877b965cab174d4e72f6ba7d0521bb3718220488d6397ebef136"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.050278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" event={"ID":"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf","Type":"ContainerStarted","Data":"b51df1f43a7e9b97fbd5d3ae91b1bbbc4466ceaf6166b57370c7c38425762ddb"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.053804 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.054026 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.553992763 +0000 UTC m=+137.478831392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.054278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.054564 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.554551019 +0000 UTC m=+137.479389648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.060203 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" event={"ID":"8522db03-eed8-439b-a1bd-afe0b724a615","Type":"ContainerStarted","Data":"3a9a59dbd6af1a69259293778148c92b2e2eed65a2bba2bbf79d73971bbb6a6d"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.060252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" event={"ID":"8522db03-eed8-439b-a1bd-afe0b724a615","Type":"ContainerStarted","Data":"29ce315f33fd762d990d48c15963e243a984bd67cb6f5a76ea63483d7d1660f2"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.141980 4735 generic.go:334] "Generic (PLEG): container finished" podID="cd154287-dca7-45d0-bd79-af3c4a117793" containerID="dd31a467401c34cbf0fcdcd00c3eeb1a08cedd175a22cb96fc0f49f23ebba946" exitCode=0 Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.142145 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" event={"ID":"cd154287-dca7-45d0-bd79-af3c4a117793","Type":"ContainerDied","Data":"dd31a467401c34cbf0fcdcd00c3eeb1a08cedd175a22cb96fc0f49f23ebba946"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.142204 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" event={"ID":"cd154287-dca7-45d0-bd79-af3c4a117793","Type":"ContainerStarted","Data":"6b91cf2bb0ce7542f4b68bae18b33f1ba168ab0924afa3880a08b7de783809d2"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.151607 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-r248q"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.153210 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" event={"ID":"3a186d2c-8f55-4033-9154-b4ff929c9a98","Type":"ContainerStarted","Data":"853a3ef513a19bad5ed455fa6c9af2cd33e6acecd30ccddbbece9b119973f720"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.155312 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.155435 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.655407678 +0000 UTC m=+137.580246306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.155958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.156874 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.656854289 +0000 UTC m=+137.581692917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.159678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" event={"ID":"df271b84-5513-4840-9faa-9b66e4fd3487","Type":"ContainerStarted","Data":"6a531ea684f31f2aa513d1fd86eeeea55888baae366ee5706da7e64f498e3a5e"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.166736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" event={"ID":"b28eb5dd-5f60-4d02-8972-99cba02cb1c8","Type":"ContainerStarted","Data":"bfe78a9e505506724f7301ecebf4f46a74bb9a5314396715b0e173dbca1f3440"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.168590 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6kh8f"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.179123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" event={"ID":"10df02c0-bbd4-4021-acf6-311c2186ff9e","Type":"ContainerStarted","Data":"af1539b531f871d2326ee3e01fffa5bbae2807aee438404a33d7da3d5f703da9"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.183805 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n8ljm" event={"ID":"4da02f65-748e-42ee-82d8-4cd5445d9fab","Type":"ContainerStarted","Data":"b4eab90dffb71684289bf33d4ceb01b9a95980c0ca4e3f08c027dfe3329cf6cc"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.183859 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n8ljm" event={"ID":"4da02f65-748e-42ee-82d8-4cd5445d9fab","Type":"ContainerStarted","Data":"44411d5792bfbc9234f9136e91216d08aab6ab5109fd1f469997eb4d09afb270"} Dec 09 
15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.186190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" event={"ID":"53061c50-1a0f-4496-a734-7a8d27f65fe6","Type":"ContainerStarted","Data":"4b4dd0d3466f8689a3156352bbc2c4bcf0ce8c0fd3b335e069cc089b2a05685e"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.186223 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" event={"ID":"53061c50-1a0f-4496-a734-7a8d27f65fe6","Type":"ContainerStarted","Data":"740258dd76fdf55af060ad68f6dfcbaf85c996a98f39656e3e252600049fe619"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.209037 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c4mlr" event={"ID":"bfe12755-b370-474e-b856-82522f9b38d0","Type":"ContainerStarted","Data":"f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.209298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c4mlr" event={"ID":"bfe12755-b370-474e-b856-82522f9b38d0","Type":"ContainerStarted","Data":"9818553b5a097eac8b42f22ae22b0f4b0a6be16a4abec3dcf7163e3668061a99"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.211559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" event={"ID":"4dee35e0-d22b-4610-90a0-9dd076a8b6a5","Type":"ContainerStarted","Data":"bc80f195845402711e27c645b482c7b730b789284987be5c31ed729fb2736592"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.213317 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" event={"ID":"105fe85b-f861-46f6-b12b-139c5f0a7780","Type":"ContainerStarted","Data":"32001eb1a39841dbf7fb6c9eac3676aeba61735c98774ca73f1eae7c224df584"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.215846 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" event={"ID":"8eefb88b-418a-4287-9b76-7e4a54d1a461","Type":"ContainerStarted","Data":"2e5859b95ab0901354d24c9f5c3c5fcbec988261056506099586c6992b649641"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.220114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" event={"ID":"525bce41-3834-48ac-a687-ce995171d333","Type":"ContainerStarted","Data":"3b250e8692931ba301a52c1b4bd0026aadaae2694704947aaa1e6a5a67d9a140"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.220155 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" event={"ID":"525bce41-3834-48ac-a687-ce995171d333","Type":"ContainerStarted","Data":"6849632b4a98332fb9d8bcb677f82c0e68d40898e781f6eddbcdea4b094449f0"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.222370 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" event={"ID":"5a589e0e-3989-407f-a8c3-5b2391bddc09","Type":"ContainerStarted","Data":"876b8e1ab87de7cb069073fad72f721fdaba43342064e2e581b9f9afcde0b30d"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.222402 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" event={"ID":"5a589e0e-3989-407f-a8c3-5b2391bddc09","Type":"ContainerStarted","Data":"cf7da1b13138049727b97a2a91dd0b4206889cc6b770a50e93390f5823546580"} Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.226246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" event={"ID":"f37c5317-cf9b-44be-a65d-982bbe6a0473","Type":"ContainerStarted","Data":"7c57c836f32e79893b54505b74854594ddd15a856d7faec187ea55b9c8901732"} Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.261946 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.761918738 +0000 UTC m=+137.686757366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.261981 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.265057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.269407 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.769382575 +0000 UTC m=+137.694221203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.313241 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxhld"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.315254 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5fvv"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.315577 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-56tt8" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.342353 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.371041 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.371973 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.871955448 +0000 UTC m=+137.796794076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.374379 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.375235 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.875204019 +0000 UTC m=+137.800042648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.478480 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.478787 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:58.978769117 +0000 UTC m=+137.903607744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.483707 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:58 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:00:58 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:00:58 crc kubenswrapper[4735]: healthz check failed Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.483752 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.580445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.581899 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.081885888 +0000 UTC m=+138.006724516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.684246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.684770 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.184757553 +0000 UTC m=+138.109596181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.695020 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zrdkv" podStartSLOduration=120.695010423 podStartE2EDuration="2m0.695010423s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:58.694056231 +0000 UTC m=+137.618894859" watchObservedRunningTime="2025-12-09 15:00:58.695010423 +0000 UTC m=+137.619849050" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.732526 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" podStartSLOduration=119.732486985 podStartE2EDuration="1m59.732486985s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:58.731886158 +0000 UTC m=+137.656724786" watchObservedRunningTime="2025-12-09 15:00:58.732486985 +0000 UTC m=+137.657325613" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.788323 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.788908 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.288892258 +0000 UTC m=+138.213730886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.815576 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.866122 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.871342 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.875509 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-n8ljm" podStartSLOduration=119.875431318 podStartE2EDuration="1m59.875431318s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:58.866083555 +0000 UTC m=+137.790922182" watchObservedRunningTime="2025-12-09 15:00:58.875431318 +0000 UTC m=+137.800269945" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.875977 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ms5pn"] Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.891325 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.891430 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.391411018 +0000 UTC m=+138.316249645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.892003 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.892301 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.392292971 +0000 UTC m=+138.317131600 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.945126 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-c4mlr" podStartSLOduration=120.945074207 podStartE2EDuration="2m0.945074207s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:58.937796425 +0000 UTC m=+137.862635053" watchObservedRunningTime="2025-12-09 15:00:58.945074207 +0000 UTC m=+137.869912834" Dec 09 15:00:58 crc kubenswrapper[4735]: I1209 15:00:58.994097 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:58 crc kubenswrapper[4735]: E1209 15:00:58.994491 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.49446469 +0000 UTC m=+138.419303318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.047852 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rkptn" podStartSLOduration=120.04783057 podStartE2EDuration="2m0.04783057s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.044448295 +0000 UTC m=+137.969286924" watchObservedRunningTime="2025-12-09 15:00:59.04783057 +0000 UTC m=+137.972669198" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.048259 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-r4hp5" podStartSLOduration=120.048253147 podStartE2EDuration="2m0.048253147s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.01541109 +0000 UTC m=+137.940249718" watchObservedRunningTime="2025-12-09 15:00:59.048253147 +0000 UTC m=+137.973091775" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.095116 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.095400 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.595391001 +0000 UTC m=+138.520229628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.175056 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-56tt8" podStartSLOduration=121.175038486 podStartE2EDuration="2m1.175038486s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.137096054 +0000 UTC m=+138.061934683" watchObservedRunningTime="2025-12-09 15:00:59.175038486 +0000 UTC m=+138.099877114" Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.201001 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.700985131 +0000 UTC m=+138.625823759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.200922 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.201308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.201667 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.701660601 +0000 UTC m=+138.626499228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.237634 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8r6dv"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.241316 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.245987 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8vprh"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.247022 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q2cst" podStartSLOduration=121.247013877 podStartE2EDuration="2m1.247013877s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.23847531 +0000 UTC m=+138.163313937" watchObservedRunningTime="2025-12-09 15:00:59.247013877 +0000 UTC m=+138.171852505" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.247597 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.251211 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.284062 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.304148 4735 generic.go:334] "Generic (PLEG): container finished" podID="da912d07-0a05-4d1c-b042-82d8a3b23467" containerID="1d5ab6821d40e951dd1c19e4b3c326d82ea4fb725abb6375926f80e69539cefd" exitCode=0 Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.304442 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" event={"ID":"da912d07-0a05-4d1c-b042-82d8a3b23467","Type":"ContainerDied","Data":"1d5ab6821d40e951dd1c19e4b3c326d82ea4fb725abb6375926f80e69539cefd"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.304465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" event={"ID":"da912d07-0a05-4d1c-b042-82d8a3b23467","Type":"ContainerStarted","Data":"e2e26f81b1c62bbb3e7ccbea6366e6f6922a25818c7543be45332816cc467eb1"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.305341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 
15:00:59.307152 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.807136455 +0000 UTC m=+138.731975083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.331587 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf" containerID="35ae0cb9c7c9ad36ea2927acb2347e74ebf8e53c9202a72a12845a48e8595af5" exitCode=0 Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.331642 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" event={"ID":"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf","Type":"ContainerDied","Data":"35ae0cb9c7c9ad36ea2927acb2347e74ebf8e53c9202a72a12845a48e8595af5"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.343298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" event={"ID":"2be1ce89-c4a7-4237-b0f3-221ac11f813a","Type":"ContainerStarted","Data":"9604107e8431c27770b1d9ee11409ecce815ae9862594d5349f8ea413110ad17"} Dec 09 15:00:59 crc kubenswrapper[4735]: W1209 15:00:59.347790 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd3099d_e717_438c_a2ce_591b598cd50e.slice/crio-7a46b1830e3465ab28004935c068801d44ec19d0f5a17c9a72960121f9250081 WatchSource:0}: Error finding container 7a46b1830e3465ab28004935c068801d44ec19d0f5a17c9a72960121f9250081: Status 404 returned error can't find the container with id 7a46b1830e3465ab28004935c068801d44ec19d0f5a17c9a72960121f9250081 Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.349207 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:00:59 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:00:59 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:00:59 crc kubenswrapper[4735]: healthz check failed Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.349238 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.350190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" event={"ID":"cd154287-dca7-45d0-bd79-af3c4a117793","Type":"ContainerStarted","Data":"3441e42ab5d490b0f044462417d0ee769c2b4ac0e975d0dc01c2692771f2c317"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.350815 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.359391 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-bxswn" podStartSLOduration=120.359378772 podStartE2EDuration="2m0.359378772s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.358557554 +0000 UTC m=+138.283396181" watchObservedRunningTime="2025-12-09 15:00:59.359378772 +0000 UTC m=+138.284217400" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.397128 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.398266 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.415364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.419003 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.421152 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:00:59.921142251 +0000 UTC m=+138.845980879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.446722 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" event={"ID":"8522db03-eed8-439b-a1bd-afe0b724a615","Type":"ContainerStarted","Data":"1a0d9f029611884359e9efccf84c979404af67e79a281109e7d08379959eb60b"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.451103 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bbnbs" podStartSLOduration=120.451085264 podStartE2EDuration="2m0.451085264s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.450817584 +0000 UTC m=+138.375656212" watchObservedRunningTime="2025-12-09 15:00:59.451085264 +0000 UTC m=+138.375923893" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.453397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r2xnv" event={"ID":"b835f641-1777-4869-8ae7-161e8f528229","Type":"ContainerStarted","Data":"99f545222203b6d3024e7c8f908065c62046212346919118aaf095e5269012c7"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.453438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-r2xnv" event={"ID":"b835f641-1777-4869-8ae7-161e8f528229","Type":"ContainerStarted","Data":"63185e5ad5fc3acef2a231e004fe3e60f445ae45e366005b9415daf7f5d2c1a5"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.454016 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.461808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2hgjx" event={"ID":"43e42b0e-4eb2-428a-9ee6-733f90aac431","Type":"ContainerStarted","Data":"4cfa93ca0911b084aa3e542e44edd16521bc00a0bf1fa19af99642cb53713567"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.464377 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.469680 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2xnv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.469730 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2xnv" podUID="b835f641-1777-4869-8ae7-161e8f528229" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.490323 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" event={"ID":"b28eb5dd-5f60-4d02-8972-99cba02cb1c8","Type":"ContainerStarted","Data":"416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.490982 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.507656 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv"] Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.513336 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.520270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.550771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" event={"ID":"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf","Type":"ContainerStarted","Data":"554cc22b2252048029972217ff0005acf70eea4061ad246557e154d5fee9c3ba"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.550819 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" event={"ID":"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf","Type":"ContainerStarted","Data":"e8b33da11478bad2caf8a327f7d9c8314d0a9185957b1f24e6f669b159232e89"} Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.552899 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.052862008 +0000 UTC m=+138.977700637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.558852 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" event={"ID":"10df02c0-bbd4-4021-acf6-311c2186ff9e","Type":"ContainerStarted","Data":"ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.559687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.563790 4735 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mxscb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.563849 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" podUID="10df02c0-bbd4-4021-acf6-311c2186ff9e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.576593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" event={"ID":"3e9eeba2-ec44-48dd-9325-9966670acd75","Type":"ContainerStarted","Data":"69c558b7deb307a764873007ad2cae84a334c5528b3869cb4aa5cd8acbb29412"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.576637 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" event={"ID":"3e9eeba2-ec44-48dd-9325-9966670acd75","Type":"ContainerStarted","Data":"80fb8e301fb4dd196f745e2fb5f31c40566b8092f5c8f184ca1ccd1358fc7fcf"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.578800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" event={"ID":"e4daca1a-712b-40cb-8943-303ae8542cab","Type":"ContainerStarted","Data":"bdb7721a9f28cc469efa17d1b1242306331fa704a0c41303b93df022be077322"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.585335 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-r2xnv" podStartSLOduration=121.585312958 podStartE2EDuration="2m1.585312958s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.580688211 +0000 UTC m=+138.505526839" watchObservedRunningTime="2025-12-09 15:00:59.585312958 +0000 UTC m=+138.510151587" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.613671 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" event={"ID":"fd66c484-656a-4a2e-a3e9-2b7a32ba2def","Type":"ContainerStarted","Data":"82b5dfabb9b1e032890f6cf5359247919d07bae2191ffb8ca2bf45e9b23c2e34"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.623304 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.626077 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.126058621 +0000 UTC m=+139.050897249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.638844 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hwpt4" podStartSLOduration=121.63882564 podStartE2EDuration="2m1.63882564s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.630662769 +0000 UTC m=+138.555501397" watchObservedRunningTime="2025-12-09 15:00:59.63882564 +0000 UTC m=+138.563664268" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.708875 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" podStartSLOduration=121.708857903 podStartE2EDuration="2m1.708857903s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.707934369 +0000 UTC m=+138.632772998" watchObservedRunningTime="2025-12-09 15:00:59.708857903 +0000 UTC m=+138.633696530" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.709285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" event={"ID":"4dee35e0-d22b-4610-90a0-9dd076a8b6a5","Type":"ContainerStarted","Data":"094c5e25a71b2d5bb506cfdf62a45486c54f62659ff8d933214f5b8fea8bac83"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.711029 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2hgjx" podStartSLOduration=5.7110169840000005 podStartE2EDuration="5.711016984s" podCreationTimestamp="2025-12-09 15:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 
15:00:59.662109192 +0000 UTC m=+138.586947820" watchObservedRunningTime="2025-12-09 15:00:59.711016984 +0000 UTC m=+138.635855612" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.723813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.724689 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.224669924 +0000 UTC m=+139.149508551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.735752 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" podStartSLOduration=121.735730284 podStartE2EDuration="2m1.735730284s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.734693516 +0000 UTC m=+138.659532144" watchObservedRunningTime="2025-12-09 15:00:59.735730284 +0000 UTC m=+138.660568913" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.784880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" event={"ID":"105fe85b-f861-46f6-b12b-139c5f0a7780","Type":"ContainerStarted","Data":"535d633f1e2299565d93a65233eff2e4076508279defa26aee673ec2c3983c8b"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.784955 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" event={"ID":"105fe85b-f861-46f6-b12b-139c5f0a7780","Type":"ContainerStarted","Data":"835ad96858d4baa08de04189331eda28130a78c300c5387b70b39df1fb2b8bb3"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.788719 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" podStartSLOduration=120.788705249 podStartE2EDuration="2m0.788705249s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.786750258 +0000 UTC m=+138.711588885" watchObservedRunningTime="2025-12-09 15:00:59.788705249 +0000 UTC m=+138.713543868" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.797843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" 
event={"ID":"4c6c87b9-fe9c-45db-bfaa-031a208177db","Type":"ContainerStarted","Data":"f9781a9c8a28c572ab942c9a12d282e67f9ce5ac4fed77fee77e1ac0c6a90bde"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.798659 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.831269 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.831650 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.331635251 +0000 UTC m=+139.256473879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.834365 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tbgl7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.834411 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" podUID="4c6c87b9-fe9c-45db-bfaa-031a208177db" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.858534 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" podStartSLOduration=120.85850088 podStartE2EDuration="2m0.85850088s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.821739433 +0000 UTC m=+138.746578061" watchObservedRunningTime="2025-12-09 15:00:59.85850088 +0000 UTC m=+138.783339508" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.859276 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" podStartSLOduration=120.859269548 podStartE2EDuration="2m0.859269548s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.856834951 +0000 UTC m=+138.781673579" 
watchObservedRunningTime="2025-12-09 15:00:59.859269548 +0000 UTC m=+138.784108176" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.884812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" event={"ID":"fc05d12f-eaa9-48ae-b280-e449caed078c","Type":"ContainerStarted","Data":"79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.884841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" event={"ID":"fc05d12f-eaa9-48ae-b280-e449caed078c","Type":"ContainerStarted","Data":"5eb2457ec378bdcab85caa6e93b6491a04544ba4bd16a63c4358516e66caaef8"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.885858 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.907335 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctqxr" podStartSLOduration=120.907311497 podStartE2EDuration="2m0.907311497s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.906136824 +0000 UTC m=+138.830975452" watchObservedRunningTime="2025-12-09 15:00:59.907311497 +0000 UTC m=+138.832150125" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.908681 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rxhld container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.908716 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.935569 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jgd5m" podStartSLOduration=121.935550457 podStartE2EDuration="2m1.935550457s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.93354531 +0000 UTC m=+138.858383938" watchObservedRunningTime="2025-12-09 15:00:59.935550457 +0000 UTC m=+138.860389085" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.936613 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:00:59 crc kubenswrapper[4735]: E1209 15:00:59.937743 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.437729136 +0000 UTC m=+139.362567764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.947147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" event={"ID":"5a589e0e-3989-407f-a8c3-5b2391bddc09","Type":"ContainerStarted","Data":"2031d594b11fa451af0219a3a8c832e89ef305ee83e8961845281fa1de9e41a9"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.961838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" event={"ID":"f37c5317-cf9b-44be-a65d-982bbe6a0473","Type":"ContainerStarted","Data":"c73f06d42c1a8296f8e9baf896c5ecd2ad01dc59e66a15ce3ae83e8f9c6b7852"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.961889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" event={"ID":"f37c5317-cf9b-44be-a65d-982bbe6a0473","Type":"ContainerStarted","Data":"607dff7ef4d969436259506b58548e9133672e7d97e24fa24af1b884ccacefab"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.982684 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" podStartSLOduration=120.98266223 podStartE2EDuration="2m0.98266223s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:00:59.978251873 +0000 UTC m=+138.903090501" watchObservedRunningTime="2025-12-09 15:00:59.98266223 +0000 UTC m=+138.907500858" Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.987781 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6kh8f" event={"ID":"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6","Type":"ContainerStarted","Data":"37e28d7eff6fdaf0b7cebe8cb8deb2b57efdb7fc9ba4549acc453f2bcfb195a6"} Dec 09 15:00:59 crc kubenswrapper[4735]: I1209 15:00:59.987815 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6kh8f" event={"ID":"f07aeefa-cf94-4c49-ac0a-bec93a1c65f6","Type":"ContainerStarted","Data":"3c290d8f7790bda6a1bb827c6861b99bdcf6ee0b04e5c286909398c402764b5d"} Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.011799 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6b7z" podStartSLOduration=121.011779968 podStartE2EDuration="2m1.011779968s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:00.008669622 +0000 UTC m=+138.933508250" watchObservedRunningTime="2025-12-09 15:01:00.011779968 +0000 UTC m=+138.936618596" Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.017334 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.038432 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.046218 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.546205337 +0000 UTC m=+139.471043965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.097216 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-96lk5" podStartSLOduration=122.09719867 podStartE2EDuration="2m2.09719867s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:00.048949976 +0000 UTC m=+138.973788604" watchObservedRunningTime="2025-12-09 15:01:00.09719867 +0000 UTC m=+139.022037298" Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.149931 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6kh8f" podStartSLOduration=6.149910523 podStartE2EDuration="6.149910523s" podCreationTimestamp="2025-12-09 15:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:00.148596004 +0000 UTC m=+139.073434631" watchObservedRunningTime="2025-12-09 15:01:00.149910523 +0000 UTC m=+139.074749151" Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.150063 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.150577 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.650561275 +0000 UTC m=+139.575399902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.251546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.252099 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.752087971 +0000 UTC m=+139.676926599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.344614 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:00 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:00 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:00 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.344658 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.353492 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.353823 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.853792127 +0000 UTC m=+139.778630755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.353915 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.369827 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.869797347 +0000 UTC m=+139.794635975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.456499 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.456768 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.956754454 +0000 UTC m=+139.881593083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.456947 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.457244 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:00.957236154 +0000 UTC m=+139.882074782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.558153 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.558662 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.05865245 +0000 UTC m=+139.983491078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.659709 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.659951 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.159942926 +0000 UTC m=+140.084781554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.760226 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.760654 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.260642624 +0000 UTC m=+140.185481252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.862018 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.862272 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.362261346 +0000 UTC m=+140.287099965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:00 crc kubenswrapper[4735]: I1209 15:01:00.962582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:00 crc kubenswrapper[4735]: E1209 15:01:00.963141 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.463128485 +0000 UTC m=+140.387967112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.015090 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" event={"ID":"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd","Type":"ContainerStarted","Data":"4583f4f2c3f5c4f69fe00df675f7105d4e78840c982d9892c8467f520c83859e"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.015130 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" event={"ID":"32dbe2cb-38d9-4c58-8bc0-8b5bb57141fd","Type":"ContainerStarted","Data":"cd19a1237d42af63c3d10bafe60ad75854da01c0296f6cf91a1476fe4ed4c240"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.025663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" event={"ID":"381225e4-030b-401b-a1c6-8926f3a806b7","Type":"ContainerStarted","Data":"097e171d01fa164e995a0fe2d49b2de4d5fd16dd3e88fa5cec4538503c50ccb1"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.025693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" event={"ID":"381225e4-030b-401b-a1c6-8926f3a806b7","Type":"ContainerStarted","Data":"eb016b1dd87ce3c39a0d09609bc39cf52e3b91efe4eda6042738f2fed2c4e0a9"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.048084 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5rzqx" podStartSLOduration=122.048075345 podStartE2EDuration="2m2.048075345s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.0324762 +0000 UTC m=+139.957314829" watchObservedRunningTime="2025-12-09 15:01:01.048075345 +0000 UTC m=+139.972913973" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.058389 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" event={"ID":"fd66c484-656a-4a2e-a3e9-2b7a32ba2def","Type":"ContainerStarted","Data":"62924d35e4b3b681cdc69153d50a8d82cb534ed642c79640f20c197a1a348891"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.063556 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.063794 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-09 15:01:01.563785131 +0000 UTC m=+140.488623749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.081733 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" event={"ID":"9348986d-f923-4e2d-86c2-6b9886b736f5","Type":"ContainerStarted","Data":"1a7f08b52918b9c22b0bacf7abbddb57ab2077ca76d067f516ea3bf8a093d5b9"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.081778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" event={"ID":"9348986d-f923-4e2d-86c2-6b9886b736f5","Type":"ContainerStarted","Data":"686dae5ec47cba862b1adc9603ebf812ebef23c951705ac4cd407f2ff994c3c6"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.082366 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.085648 4735 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xxlf7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.085691 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" podUID="9348986d-f923-4e2d-86c2-6b9886b736f5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.092119 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w8m5z" podStartSLOduration=122.092110657 podStartE2EDuration="2m2.092110657s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.049837639 +0000 UTC m=+139.974676267" watchObservedRunningTime="2025-12-09 15:01:01.092110657 +0000 UTC m=+140.016949285" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.100821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" event={"ID":"3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf","Type":"ContainerStarted","Data":"a8856e7e80580c623e91351107754e3f2b85f83e5a3188a5f36830f99f009d3d"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.104618 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" event={"ID":"2be1ce89-c4a7-4237-b0f3-221ac11f813a","Type":"ContainerStarted","Data":"58c5da9e455b5c0282fa977af323b4022669c1513760aa88f09dca1e004b4934"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 
15:01:01.108581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" event={"ID":"45325644-f6e0-48dd-ac6f-05d02cbee704","Type":"ContainerStarted","Data":"5d278913a5cdd74ddb2d872d7f355b934134d853f1eb040ee5fdbefd6c2bc28f"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.108608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" event={"ID":"45325644-f6e0-48dd-ac6f-05d02cbee704","Type":"ContainerStarted","Data":"f7f864990db87bb4e6e0f53b3a0cd87be61bf930658116d53d12c2e0b05c67fe"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.108618 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" event={"ID":"45325644-f6e0-48dd-ac6f-05d02cbee704","Type":"ContainerStarted","Data":"feeb04b07c5de9cd43df977be2f6b60eee09110b5c4a24a9217338e6b1c63c59"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.108966 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.113259 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" event={"ID":"db303c23-142d-43ce-ba09-59581443cc4e","Type":"ContainerStarted","Data":"4c9841edbfbdc4ab016174eb7654d8ffe487822ea13db493f7a258c6d87544de"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.113293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" event={"ID":"db303c23-142d-43ce-ba09-59581443cc4e","Type":"ContainerStarted","Data":"55b73ba2a1d744ecba4967369135b43ea9764a11b6dd0e2498d65dcc7ceffaed"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.113793 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.116749 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" event={"ID":"67a91130-56ef-4053-b062-ed2dcde04121","Type":"ContainerStarted","Data":"0e60cfbcac47f3c5482f54ed2469df2bebe549f363b4600ab274bde5342e201f"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.116774 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" event={"ID":"67a91130-56ef-4053-b062-ed2dcde04121","Type":"ContainerStarted","Data":"5311d4d0cd105df0cc39b57bb0af3ab16389eef099bd4825eaa41eb5078a85ba"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.121078 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wtqjc" podStartSLOduration=122.12106894 podStartE2EDuration="2m2.12106894s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.092793831 +0000 UTC m=+140.017632459" watchObservedRunningTime="2025-12-09 15:01:01.12106894 +0000 UTC m=+140.045907569" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.125669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" event={"ID":"63bb7490-d246-4f4b-9db9-bb254344f4c9","Type":"ContainerStarted","Data":"2851b9378be1241cb46c9c865f2961eb548a7b0ea989c788f5f21bac96aa23f5"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.125715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" event={"ID":"63bb7490-d246-4f4b-9db9-bb254344f4c9","Type":"ContainerStarted","Data":"8dcd1f3794214b789e9ad0e58c32bc5ca95896ae7fbd468fb5db82988662e437"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.125726 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" event={"ID":"63bb7490-d246-4f4b-9db9-bb254344f4c9","Type":"ContainerStarted","Data":"b39ee397c2c3e452fb7a187423f79c264e55e4532805e45fbfd68524c361bda7"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.126455 4735 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qwjzw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.126490 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" podUID="db303c23-142d-43ce-ba09-59581443cc4e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.141508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" event={"ID":"fb600470-3773-4c44-9069-8b8aa7c18bd6","Type":"ContainerStarted","Data":"3016e4d81847b916cdf815542f040724055b96671e57dabb325d277fd9ff139d"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.141575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" event={"ID":"fb600470-3773-4c44-9069-8b8aa7c18bd6","Type":"ContainerStarted","Data":"c9b336d16075a488778cd2437eb223c3e4086b8653f096dfce24fed29c841c18"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.141587 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" event={"ID":"fb600470-3773-4c44-9069-8b8aa7c18bd6","Type":"ContainerStarted","Data":"ec9094c02cac208cb92e92614bf1acc52bfc62628471ee324e0c01f19e3bd10f"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.145364 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" podStartSLOduration=61.145356269 podStartE2EDuration="1m1.145356269s" podCreationTimestamp="2025-12-09 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.142743081 +0000 UTC m=+140.067581708" watchObservedRunningTime="2025-12-09 15:01:01.145356269 +0000 UTC m=+140.070194896" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.145778 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" 
podStartSLOduration=122.145773855 podStartE2EDuration="2m2.145773855s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.125712894 +0000 UTC m=+140.050551522" watchObservedRunningTime="2025-12-09 15:01:01.145773855 +0000 UTC m=+140.070612483" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.151072 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" event={"ID":"1ec89240-f1c2-444a-9114-5cf83fab7e9d","Type":"ContainerStarted","Data":"9c17d6487e961da4956e57312a926c7c95b59ce2ed9d35608869a207e6bb30cb"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.151099 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" event={"ID":"1ec89240-f1c2-444a-9114-5cf83fab7e9d","Type":"ContainerStarted","Data":"f8fa71bf452d841a9fe1fe148034ad0aea695565103acdf8ec07f4cd94c29190"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.169011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.170103 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.67009088 +0000 UTC m=+140.594929507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.171577 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" podStartSLOduration=122.171569963 podStartE2EDuration="2m2.171569963s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.170932526 +0000 UTC m=+140.095771154" watchObservedRunningTime="2025-12-09 15:01:01.171569963 +0000 UTC m=+140.096408591" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.175679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" event={"ID":"4c6c87b9-fe9c-45db-bfaa-031a208177db","Type":"ContainerStarted","Data":"e8664fccc527766540f4e57214bf8c6c5327b82be5d5c9fd2ad08dd4df32d16f"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.186653 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tbgl7" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.190756 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" event={"ID":"3e9eeba2-ec44-48dd-9325-9966670acd75","Type":"ContainerStarted","Data":"c7547bc73163b2b2a233ace73ce965d85eb5c774bd1b2066d93cf3e2770563c4"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.193850 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8r6dv" event={"ID":"ffd3099d-e717-438c-a2ce-591b598cd50e","Type":"ContainerStarted","Data":"aab7a16e1eb9fd8757ca7086489addf6180c48bd10ce1928686680f98262f126"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.193876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8r6dv" event={"ID":"ffd3099d-e717-438c-a2ce-591b598cd50e","Type":"ContainerStarted","Data":"7a46b1830e3465ab28004935c068801d44ec19d0f5a17c9a72960121f9250081"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.194198 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8r6dv" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.195980 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" podStartSLOduration=122.195974014 podStartE2EDuration="2m2.195974014s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.194235656 +0000 UTC m=+140.119074284" watchObservedRunningTime="2025-12-09 15:01:01.195974014 +0000 UTC m=+140.120812632" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.206654 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" 
event={"ID":"e0a4d2fb-404c-4f5a-b8b4-29e04d60fedf","Type":"ContainerStarted","Data":"85ccd6c21f8ca928131a880ae4d0e43608d6efa1193a5dbe5eeb3a8e5daa1fc0"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.215273 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" podStartSLOduration=122.215265175 podStartE2EDuration="2m2.215265175s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.213779098 +0000 UTC m=+140.138617727" watchObservedRunningTime="2025-12-09 15:01:01.215265175 +0000 UTC m=+140.140103804" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.222165 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-pcdrs" event={"ID":"e4daca1a-712b-40cb-8943-303ae8542cab","Type":"ContainerStarted","Data":"d5ecdb987cf2abbe002efe25409a8619290e99a4972fcd3286236c046d9f3f9f"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.234957 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" event={"ID":"da912d07-0a05-4d1c-b042-82d8a3b23467","Type":"ContainerStarted","Data":"e50fa01b51886c45035c02b75bf6e593134ddd5dcfe1ebbd9f0aa55dfadc02ce"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.255110 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" event={"ID":"69643ed9-426d-4f75-acf0-2871fd6f6f9e","Type":"ContainerStarted","Data":"f9025b8c888427d816104ba8ddd670e5711c67caee91b6b0eabd4641cdb7905e"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.255136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" event={"ID":"69643ed9-426d-4f75-acf0-2871fd6f6f9e","Type":"ContainerStarted","Data":"daa598c2664f9b1fbd647ac3bccde2cbbfd84c53fa487eb6df806a303dcef866"} Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.260874 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2xnv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.260918 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2xnv" podUID="b835f641-1777-4869-8ae7-161e8f528229" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.261165 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rxhld container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.261191 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.269883 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.270380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.271580 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.771569224 +0000 UTC m=+140.696407852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.275740 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mr9zt" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.297032 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-r248q" podStartSLOduration=122.297011356 podStartE2EDuration="2m2.297011356s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.25467619 +0000 UTC m=+140.179514818" watchObservedRunningTime="2025-12-09 15:01:01.297011356 +0000 UTC m=+140.221849985" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.325067 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8vprh" podStartSLOduration=122.325048702 podStartE2EDuration="2m2.325048702s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.323806001 +0000 UTC m=+140.248644628" watchObservedRunningTime="2025-12-09 15:01:01.325048702 +0000 UTC m=+140.249887330" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.351334 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:01 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:01 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:01 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.351403 4735 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.381112 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.383152 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.883131386 +0000 UTC m=+140.807970014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.400984 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-p77wx" podStartSLOduration=122.400968372 podStartE2EDuration="2m2.400968372s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.363981284 +0000 UTC m=+140.288819912" watchObservedRunningTime="2025-12-09 15:01:01.400968372 +0000 UTC m=+140.325807001" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.401103 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rzc8z" podStartSLOduration=122.401099352 podStartE2EDuration="2m2.401099352s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.398818749 +0000 UTC m=+140.323657377" watchObservedRunningTime="2025-12-09 15:01:01.401099352 +0000 UTC m=+140.325937980" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.433266 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6hv2p" podStartSLOduration=122.433250409 podStartE2EDuration="2m2.433250409s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.432204064 +0000 UTC m=+140.357042691" watchObservedRunningTime="2025-12-09 15:01:01.433250409 +0000 UTC m=+140.358089038" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.486219 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.486593 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:01.986580693 +0000 UTC m=+140.911419321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.494051 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8r6dv" podStartSLOduration=7.494040913 podStartE2EDuration="7.494040913s" podCreationTimestamp="2025-12-09 15:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.490326493 +0000 UTC m=+140.415165121" watchObservedRunningTime="2025-12-09 15:01:01.494040913 +0000 UTC m=+140.418879541" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.589908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.590259 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.090246031 +0000 UTC m=+141.015084660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.618174 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" podStartSLOduration=123.618161795 podStartE2EDuration="2m3.618161795s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.61597387 +0000 UTC m=+140.540812498" watchObservedRunningTime="2025-12-09 15:01:01.618161795 +0000 UTC m=+140.543000423" Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.695908 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.696608 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.196591166 +0000 UTC m=+141.121429795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.797320 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.797885 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.297867124 +0000 UTC m=+141.222705752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.898807 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.899098 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.399085012 +0000 UTC m=+141.323923640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:01 crc kubenswrapper[4735]: I1209 15:01:01.999450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:01 crc kubenswrapper[4735]: E1209 15:01:01.999848 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.499834576 +0000 UTC m=+141.424673204 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.102424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: E1209 15:01:02.102811 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.602796693 +0000 UTC m=+141.527635321 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.175547 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.175607 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.203452 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:02 crc kubenswrapper[4735]: E1209 15:01:02.203697 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.703677767 +0000 UTC m=+141.628516395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.203837 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: E1209 15:01:02.204194 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.704185155 +0000 UTC m=+141.629023783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.268173 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8r6dv" event={"ID":"ffd3099d-e717-438c-a2ce-591b598cd50e","Type":"ContainerStarted","Data":"88d1bfd4ef311f89fa9e4669cd7fc3a981056a77f09d131892f3882c1e61bebf"} Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.272820 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.278043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" event={"ID":"2be1ce89-c4a7-4237-b0f3-221ac11f813a","Type":"ContainerStarted","Data":"2553d3653b9e5890be62ec1a301f97ae9a2456001a67ab9099c8f9b844abe69f"} Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.278078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" event={"ID":"2be1ce89-c4a7-4237-b0f3-221ac11f813a","Type":"ContainerStarted","Data":"2af2f6154face707ae5f7a1f9ec075583891a06a3e64105af0a83de3875a9a62"} Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.290638 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" event={"ID":"da912d07-0a05-4d1c-b042-82d8a3b23467","Type":"ContainerStarted","Data":"21b995d1c4f15b14807b021ec76715d31506238a285dc4db14faac465e068120"} Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.291040 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-r2xnv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 09 15:01:02 crc 
kubenswrapper[4735]: I1209 15:01:02.291081 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-r2xnv" podUID="b835f641-1777-4869-8ae7-161e8f528229" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.36:8080/\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.302870 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5xt67" podStartSLOduration=123.302860911 podStartE2EDuration="2m3.302860911s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:01.641767473 +0000 UTC m=+140.566606101" watchObservedRunningTime="2025-12-09 15:01:02.302860911 +0000 UTC m=+141.227699538" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.304879 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.305417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:02 crc kubenswrapper[4735]: E1209 15:01:02.305888 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.805867429 +0000 UTC m=+141.730706057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.310926 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xxlf7" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.321570 4735 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-cgl2q container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]log ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]etcd ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]etcd-readiness ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 15:01:02 crc kubenswrapper[4735]: [-]informer-sync failed: reason withheld Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/max-in-flight-filter ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 09 15:01:02 crc kubenswrapper[4735]: [+]shutdown ok Dec 09 15:01:02 crc kubenswrapper[4735]: readyz check failed Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.321624 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" podUID="3a64d19a-6cab-4186-9ae0-dd5f8e93d7bf" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.353359 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:02 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:02 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:02 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.353415 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.368072 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qwjzw" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.395614 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:01:02 crc 
kubenswrapper[4735]: I1209 15:01:02.395665 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.406993 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: E1209 15:01:02.410356 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-09 15:01:02.910339839 +0000 UTC m=+141.835178467 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gwkvf" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.443866 4735 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.509217 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:02 crc kubenswrapper[4735]: E1209 15:01:02.509728 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-09 15:01:03.009712945 +0000 UTC m=+141.934551573 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.555651 4735 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-09T15:01:02.44406502Z","Handler":null,"Name":""} Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.592640 4735 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.592678 4735 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.611268 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.613472 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.613507 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.634414 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gwkvf\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.712584 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.715267 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.717219 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.802243 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-78zr9"] Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.814008 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.816774 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.828979 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78zr9"] Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.887067 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gwkvf"] Dec 09 15:01:02 crc kubenswrapper[4735]: W1209 15:01:02.892174 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d00084f_8f8b_4cd8_93b6_a05ec4ac5891.slice/crio-9ed1c265781dbd87d9a6bdaa3f19cd4be724674ae1858e5c80b0dcb4f4c9b716 WatchSource:0}: Error finding container 9ed1c265781dbd87d9a6bdaa3f19cd4be724674ae1858e5c80b0dcb4f4c9b716: Status 404 returned error can't find the container with id 9ed1c265781dbd87d9a6bdaa3f19cd4be724674ae1858e5c80b0dcb4f4c9b716 Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.916703 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-utilities\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.916764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6bts\" (UniqueName: \"kubernetes.io/projected/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-kube-api-access-t6bts\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:02 crc kubenswrapper[4735]: I1209 15:01:02.916819 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-catalog-content\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.003209 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xmc6l"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.004173 4735 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.005625 4735 patch_prober.go:28] interesting pod/apiserver-76f77b778f-t5fvv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]log ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]etcd ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/generic-apiserver-start-informers ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/max-in-flight-filter ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 09 15:01:03 crc kubenswrapper[4735]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 09 15:01:03 crc kubenswrapper[4735]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/project.openshift.io-projectcache ok Dec 09 15:01:03 crc kubenswrapper[4735]: [-]poststarthook/project.openshift.io-projectauthorizationcache failed: reason withheld Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-startinformers ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 09 15:01:03 crc kubenswrapper[4735]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 09 15:01:03 crc kubenswrapper[4735]: livez check failed Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.005671 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" podUID="da912d07-0a05-4d1c-b042-82d8a3b23467" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.006097 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.013162 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmc6l"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.018588 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-catalog-content\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.018736 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-utilities\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.018799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6bts\" (UniqueName: \"kubernetes.io/projected/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-kube-api-access-t6bts\") pod 
\"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.019005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-catalog-content\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.019133 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-utilities\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.039177 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6bts\" (UniqueName: \"kubernetes.io/projected/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-kube-api-access-t6bts\") pod \"community-operators-78zr9\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.120173 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-utilities\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.120221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-catalog-content\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.120272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcwbv\" (UniqueName: \"kubernetes.io/projected/7ea62e8b-5dc1-4527-8279-2845d3666202-kube-api-access-xcwbv\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.134901 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.200490 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-578s6"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.201498 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.212863 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-578s6"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.222023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-utilities\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.222057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-catalog-content\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.222088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcwbv\" (UniqueName: \"kubernetes.io/projected/7ea62e8b-5dc1-4527-8279-2845d3666202-kube-api-access-xcwbv\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.222983 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-utilities\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.223190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-catalog-content\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.235760 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcwbv\" (UniqueName: \"kubernetes.io/projected/7ea62e8b-5dc1-4527-8279-2845d3666202-kube-api-access-xcwbv\") pod \"certified-operators-xmc6l\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.297379 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78zr9"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.299087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" event={"ID":"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891","Type":"ContainerStarted","Data":"0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5"} Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.299124 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" event={"ID":"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891","Type":"ContainerStarted","Data":"9ed1c265781dbd87d9a6bdaa3f19cd4be724674ae1858e5c80b0dcb4f4c9b716"} Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.299883 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.302043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" event={"ID":"2be1ce89-c4a7-4237-b0f3-221ac11f813a","Type":"ContainerStarted","Data":"e7a6808a8da618979a7f0fd8feac02380273ef512ac8db2388dbebc698a3710e"} Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.303479 4735 generic.go:334] "Generic (PLEG): container finished" podID="67a91130-56ef-4053-b062-ed2dcde04121" containerID="0e60cfbcac47f3c5482f54ed2469df2bebe549f363b4600ab274bde5342e201f" exitCode=0 Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.303858 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" event={"ID":"67a91130-56ef-4053-b062-ed2dcde04121","Type":"ContainerDied","Data":"0e60cfbcac47f3c5482f54ed2469df2bebe549f363b4600ab274bde5342e201f"} Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.310468 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-cgl2q" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.314203 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.315140 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" podStartSLOduration=124.315129377 podStartE2EDuration="2m4.315129377s" podCreationTimestamp="2025-12-09 14:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:03.313825899 +0000 UTC m=+142.238664527" watchObservedRunningTime="2025-12-09 15:01:03.315129377 +0000 UTC m=+142.239968004" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.323663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-catalog-content\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.323698 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwkdw\" (UniqueName: \"kubernetes.io/projected/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-kube-api-access-zwkdw\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.323717 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-utilities\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.351012 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 
09 15:01:03 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:03 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:03 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.351281 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.401852 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ms5pn" podStartSLOduration=9.401835285 podStartE2EDuration="9.401835285s" podCreationTimestamp="2025-12-09 15:00:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:03.363799454 +0000 UTC m=+142.288638083" watchObservedRunningTime="2025-12-09 15:01:03.401835285 +0000 UTC m=+142.326673913" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.402786 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krqpw"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.403754 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.421316 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.421885 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krqpw"] Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.424620 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-catalog-content\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.424652 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwkdw\" (UniqueName: \"kubernetes.io/projected/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-kube-api-access-zwkdw\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.424699 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-utilities\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.426003 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-catalog-content\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.426665 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-utilities\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.454222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwkdw\" (UniqueName: \"kubernetes.io/projected/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-kube-api-access-zwkdw\") pod \"community-operators-578s6\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.526946 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.527297 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-catalog-content\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.527439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-utilities\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.527467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4kzj\" (UniqueName: \"kubernetes.io/projected/f0bcff89-5f52-43dc-ab10-6704a9143fbe-kube-api-access-v4kzj\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.629033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-utilities\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.629315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4kzj\" (UniqueName: \"kubernetes.io/projected/f0bcff89-5f52-43dc-ab10-6704a9143fbe-kube-api-access-v4kzj\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.629366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-catalog-content\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.629819 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-catalog-content\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.629817 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-utilities\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.642924 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4kzj\" (UniqueName: \"kubernetes.io/projected/f0bcff89-5f52-43dc-ab10-6704a9143fbe-kube-api-access-v4kzj\") pod \"certified-operators-krqpw\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.664548 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-578s6"] Dec 09 15:01:03 crc kubenswrapper[4735]: W1209 15:01:03.669150 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf032c5cf_0cbe_44c7_9523_1e19e33a31d4.slice/crio-9ded272b5cdf5d58e74b275c7fe286523e9811528c2594636c7584639018e0f8 WatchSource:0}: Error finding container 9ded272b5cdf5d58e74b275c7fe286523e9811528c2594636c7584639018e0f8: Status 404 returned error can't find the container with id 9ded272b5cdf5d58e74b275c7fe286523e9811528c2594636c7584639018e0f8 Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.713894 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.745598 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmc6l"] Dec 09 15:01:03 crc kubenswrapper[4735]: W1209 15:01:03.748349 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea62e8b_5dc1_4527_8279_2845d3666202.slice/crio-5a9d48169f7184f3bd6cc8d23abb19afe436c4e0f6aa017050feb4dc288feca1 WatchSource:0}: Error finding container 5a9d48169f7184f3bd6cc8d23abb19afe436c4e0f6aa017050feb4dc288feca1: Status 404 returned error can't find the container with id 5a9d48169f7184f3bd6cc8d23abb19afe436c4e0f6aa017050feb4dc288feca1 Dec 09 15:01:03 crc kubenswrapper[4735]: I1209 15:01:03.861820 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krqpw"] Dec 09 15:01:03 crc kubenswrapper[4735]: W1209 15:01:03.862655 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0bcff89_5f52_43dc_ab10_6704a9143fbe.slice/crio-7ba9323c5e5364a2a48f5945c76cd48d2d29bdbe68fdf2318a293a6b8c41540d WatchSource:0}: Error finding container 7ba9323c5e5364a2a48f5945c76cd48d2d29bdbe68fdf2318a293a6b8c41540d: Status 404 returned error can't find the container with id 7ba9323c5e5364a2a48f5945c76cd48d2d29bdbe68fdf2318a293a6b8c41540d Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.310085 4735 generic.go:334] "Generic (PLEG): container finished" podID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerID="663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42" exitCode=0 Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.310147 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578s6" event={"ID":"f032c5cf-0cbe-44c7-9523-1e19e33a31d4","Type":"ContainerDied","Data":"663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.310206 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578s6" event={"ID":"f032c5cf-0cbe-44c7-9523-1e19e33a31d4","Type":"ContainerStarted","Data":"9ded272b5cdf5d58e74b275c7fe286523e9811528c2594636c7584639018e0f8"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.311578 4735 generic.go:334] "Generic (PLEG): container finished" podID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerID="9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1" exitCode=0 Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.311661 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krqpw" event={"ID":"f0bcff89-5f52-43dc-ab10-6704a9143fbe","Type":"ContainerDied","Data":"9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.311693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krqpw" event={"ID":"f0bcff89-5f52-43dc-ab10-6704a9143fbe","Type":"ContainerStarted","Data":"7ba9323c5e5364a2a48f5945c76cd48d2d29bdbe68fdf2318a293a6b8c41540d"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.312456 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.313221 4735 generic.go:334] 
"Generic (PLEG): container finished" podID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerID="e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32" exitCode=0 Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.313291 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmc6l" event={"ID":"7ea62e8b-5dc1-4527-8279-2845d3666202","Type":"ContainerDied","Data":"e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.313322 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmc6l" event={"ID":"7ea62e8b-5dc1-4527-8279-2845d3666202","Type":"ContainerStarted","Data":"5a9d48169f7184f3bd6cc8d23abb19afe436c4e0f6aa017050feb4dc288feca1"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.315696 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerID="28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c" exitCode=0 Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.315806 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78zr9" event={"ID":"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8","Type":"ContainerDied","Data":"28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.315830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78zr9" event={"ID":"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8","Type":"ContainerStarted","Data":"9bbcd08ee3dc5c12af3c9347823fd517b0e27680866609303332c6fd5fbbf590"} Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.335736 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.335783 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.348289 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:04 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:04 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:04 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.348342 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.483663 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.644160 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67a91130-56ef-4053-b062-ed2dcde04121-config-volume\") pod \"67a91130-56ef-4053-b062-ed2dcde04121\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.644250 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45gql\" (UniqueName: \"kubernetes.io/projected/67a91130-56ef-4053-b062-ed2dcde04121-kube-api-access-45gql\") pod \"67a91130-56ef-4053-b062-ed2dcde04121\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.644279 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67a91130-56ef-4053-b062-ed2dcde04121-secret-volume\") pod \"67a91130-56ef-4053-b062-ed2dcde04121\" (UID: \"67a91130-56ef-4053-b062-ed2dcde04121\") " Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.644757 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a91130-56ef-4053-b062-ed2dcde04121-config-volume" (OuterVolumeSpecName: "config-volume") pod "67a91130-56ef-4053-b062-ed2dcde04121" (UID: "67a91130-56ef-4053-b062-ed2dcde04121"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.649818 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a91130-56ef-4053-b062-ed2dcde04121-kube-api-access-45gql" (OuterVolumeSpecName: "kube-api-access-45gql") pod "67a91130-56ef-4053-b062-ed2dcde04121" (UID: "67a91130-56ef-4053-b062-ed2dcde04121"). InnerVolumeSpecName "kube-api-access-45gql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.650246 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67a91130-56ef-4053-b062-ed2dcde04121-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67a91130-56ef-4053-b062-ed2dcde04121" (UID: "67a91130-56ef-4053-b062-ed2dcde04121"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.745478 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67a91130-56ef-4053-b062-ed2dcde04121-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.745527 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45gql\" (UniqueName: \"kubernetes.io/projected/67a91130-56ef-4053-b062-ed2dcde04121-kube-api-access-45gql\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:04 crc kubenswrapper[4735]: I1209 15:01:04.745540 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67a91130-56ef-4053-b062-ed2dcde04121-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:04.999453 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-68nhq"] Dec 09 15:01:05 crc kubenswrapper[4735]: E1209 15:01:04.999691 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67a91130-56ef-4053-b062-ed2dcde04121" containerName="collect-profiles" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:04.999705 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a91130-56ef-4053-b062-ed2dcde04121" containerName="collect-profiles" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:04.999820 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="67a91130-56ef-4053-b062-ed2dcde04121" containerName="collect-profiles" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.000466 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.003304 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.007052 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68nhq"] Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.151188 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-utilities\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.151267 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-catalog-content\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.151308 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvgj\" (UniqueName: \"kubernetes.io/projected/f273a98e-41b2-45ac-b140-8c73aaeeed54-kube-api-access-vqvgj\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.234192 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.235895 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.239997 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.240209 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.240247 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.253471 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-utilities\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.253554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-catalog-content\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.253573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvgj\" (UniqueName: \"kubernetes.io/projected/f273a98e-41b2-45ac-b140-8c73aaeeed54-kube-api-access-vqvgj\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.254300 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-catalog-content\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.254862 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-utilities\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.286546 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvgj\" (UniqueName: \"kubernetes.io/projected/f273a98e-41b2-45ac-b140-8c73aaeeed54-kube-api-access-vqvgj\") pod \"redhat-marketplace-68nhq\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.325392 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.332140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" event={"ID":"67a91130-56ef-4053-b062-ed2dcde04121","Type":"ContainerDied","Data":"5311d4d0cd105df0cc39b57bb0af3ab16389eef099bd4825eaa41eb5078a85ba"} Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.332232 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421540-x8zxv" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.332263 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5311d4d0cd105df0cc39b57bb0af3ab16389eef099bd4825eaa41eb5078a85ba" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.345959 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:05 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:05 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:05 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.346001 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.356035 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.356112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.404494 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7qzlx"] Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.405819 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.408366 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qzlx"] Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.457226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.457340 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.457782 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.473001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.553816 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.558731 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-utilities\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.558810 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-catalog-content\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.558898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjxqg\" (UniqueName: \"kubernetes.io/projected/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-kube-api-access-xjxqg\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.659797 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-utilities\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.659860 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-catalog-content\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.660335 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-utilities\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.660453 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-catalog-content\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.660495 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjxqg\" (UniqueName: \"kubernetes.io/projected/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-kube-api-access-xjxqg\") pod \"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.677815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjxqg\" (UniqueName: \"kubernetes.io/projected/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-kube-api-access-xjxqg\") pod 
\"redhat-marketplace-7qzlx\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.728044 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.749436 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-68nhq"] Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.795926 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 09 15:01:05 crc kubenswrapper[4735]: W1209 15:01:05.855779 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5576eea1_c9d0_4751_a19a_6f664fecc2b7.slice/crio-7d6a8cb0dc24004b6da33b65c61b2fcb8216f3d9d20c8a842d284d26399d40d9 WatchSource:0}: Error finding container 7d6a8cb0dc24004b6da33b65c61b2fcb8216f3d9d20c8a842d284d26399d40d9: Status 404 returned error can't find the container with id 7d6a8cb0dc24004b6da33b65c61b2fcb8216f3d9d20c8a842d284d26399d40d9 Dec 09 15:01:05 crc kubenswrapper[4735]: I1209 15:01:05.998893 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4mdmq"] Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.000102 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.002383 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.005048 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mdmq"] Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.150042 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qzlx"] Dec 09 15:01:06 crc kubenswrapper[4735]: W1209 15:01:06.165885 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba7db0c8_c525_4eb8_8a40_4e0ad2d00914.slice/crio-73fa931fded0a317b390a5870bd367995699134a845b7a8de96e4735bca97a43 WatchSource:0}: Error finding container 73fa931fded0a317b390a5870bd367995699134a845b7a8de96e4735bca97a43: Status 404 returned error can't find the container with id 73fa931fded0a317b390a5870bd367995699134a845b7a8de96e4735bca97a43 Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.167313 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-catalog-content\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.167467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-utilities\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.167531 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxj5p\" (UniqueName: \"kubernetes.io/projected/bc0a0b6f-c304-4350-ad88-813e4637cad7-kube-api-access-sxj5p\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.189961 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.190723 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.192673 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.192759 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.192761 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.268373 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-utilities\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.268430 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxj5p\" (UniqueName: \"kubernetes.io/projected/bc0a0b6f-c304-4350-ad88-813e4637cad7-kube-api-access-sxj5p\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.269179 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-catalog-content\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.269462 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-utilities\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.271240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-catalog-content\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.285039 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxj5p\" (UniqueName: \"kubernetes.io/projected/bc0a0b6f-c304-4350-ad88-813e4637cad7-kube-api-access-sxj5p\") pod \"redhat-operators-4mdmq\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " pod="openshift-marketplace/redhat-operators-4mdmq" Dec 
09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.312696 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.340842 4735 generic.go:334] "Generic (PLEG): container finished" podID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerID="7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8" exitCode=0 Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.340913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68nhq" event={"ID":"f273a98e-41b2-45ac-b140-8c73aaeeed54","Type":"ContainerDied","Data":"7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8"} Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.340941 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68nhq" event={"ID":"f273a98e-41b2-45ac-b140-8c73aaeeed54","Type":"ContainerStarted","Data":"f319e668b4d3ad416e1720d5be5c087dc99febdf1e4abc355e92749f6a1e261c"} Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.344649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5576eea1-c9d0-4751-a19a-6f664fecc2b7","Type":"ContainerStarted","Data":"49671bc7955b71dc35a6f1ed2adb36136f958f6ec3a66eb76753b194af00dab0"} Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.344685 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5576eea1-c9d0-4751-a19a-6f664fecc2b7","Type":"ContainerStarted","Data":"7d6a8cb0dc24004b6da33b65c61b2fcb8216f3d9d20c8a842d284d26399d40d9"} Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.346894 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerID="b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528" exitCode=0 Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.346948 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qzlx" event={"ID":"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914","Type":"ContainerDied","Data":"b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528"} Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.347172 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qzlx" event={"ID":"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914","Type":"ContainerStarted","Data":"73fa931fded0a317b390a5870bd367995699134a845b7a8de96e4735bca97a43"} Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.349398 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:06 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:06 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:06 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.349422 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.367445 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.367426674 podStartE2EDuration="1.367426674s" podCreationTimestamp="2025-12-09 15:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:06.364572216 +0000 UTC m=+145.289410844" watchObservedRunningTime="2025-12-09 15:01:06.367426674 +0000 UTC m=+145.292265303" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.370567 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.370628 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.417037 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tfnd2"] Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.425780 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.426906 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfnd2"] Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.472033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.472073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.472664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.489908 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.520425 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.576857 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr6lq\" (UniqueName: \"kubernetes.io/projected/6faf8566-a679-4c0f-866b-6f1b58c91769-kube-api-access-lr6lq\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.576950 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-catalog-content\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.576986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-utilities\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.678204 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr6lq\" (UniqueName: \"kubernetes.io/projected/6faf8566-a679-4c0f-866b-6f1b58c91769-kube-api-access-lr6lq\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.678446 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-catalog-content\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.678477 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-utilities\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.679110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-catalog-content\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.679142 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-utilities\") pod \"redhat-operators-tfnd2\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.692729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr6lq\" (UniqueName: \"kubernetes.io/projected/6faf8566-a679-4c0f-866b-6f1b58c91769-kube-api-access-lr6lq\") pod \"redhat-operators-tfnd2\" (UID: 
\"6faf8566-a679-4c0f-866b-6f1b58c91769\") " pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.721027 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mdmq"] Dec 09 15:01:06 crc kubenswrapper[4735]: W1209 15:01:06.729259 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0a0b6f_c304_4350_ad88_813e4637cad7.slice/crio-6fbf98357bca8c43bf0c1a855b0826d317bacbb8fbd206e615a0aba66a1505f3 WatchSource:0}: Error finding container 6fbf98357bca8c43bf0c1a855b0826d317bacbb8fbd206e615a0aba66a1505f3: Status 404 returned error can't find the container with id 6fbf98357bca8c43bf0c1a855b0826d317bacbb8fbd206e615a0aba66a1505f3 Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.759720 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.898035 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.898074 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.899367 4735 patch_prober.go:28] interesting pod/console-f9d7485db-c4mlr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.899412 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-c4mlr" podUID="bfe12755-b370-474e-b856-82522f9b38d0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.921145 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tfnd2"] Dec 09 15:01:06 crc kubenswrapper[4735]: W1209 15:01:06.930496 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6faf8566_a679_4c0f_866b_6f1b58c91769.slice/crio-4614eb40caf3cd529894337b50381e56a48bebbe447989488814f53fd2e2f748 WatchSource:0}: Error finding container 4614eb40caf3cd529894337b50381e56a48bebbe447989488814f53fd2e2f748: Status 404 returned error can't find the container with id 4614eb40caf3cd529894337b50381e56a48bebbe447989488814f53fd2e2f748 Dec 09 15:01:06 crc kubenswrapper[4735]: I1209 15:01:06.938689 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 09 15:01:06 crc kubenswrapper[4735]: W1209 15:01:06.948339 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc09fa57d_4e50_47b5_bd3b_8f260fdebfff.slice/crio-844b452033fbc47dade8e08e05fdac5370c10a09958c05dcaf8198bc936d7d41 WatchSource:0}: Error finding container 844b452033fbc47dade8e08e05fdac5370c10a09958c05dcaf8198bc936d7d41: Status 404 returned error can't find the container with id 844b452033fbc47dade8e08e05fdac5370c10a09958c05dcaf8198bc936d7d41 Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.315238 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/downloads-7954f5f757-r2xnv" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.342598 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.347075 4735 patch_prober.go:28] interesting pod/router-default-5444994796-n8ljm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 09 15:01:07 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Dec 09 15:01:07 crc kubenswrapper[4735]: [+]process-running ok Dec 09 15:01:07 crc kubenswrapper[4735]: healthz check failed Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.347127 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n8ljm" podUID="4da02f65-748e-42ee-82d8-4cd5445d9fab" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.355581 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c09fa57d-4e50-47b5-bd3b-8f260fdebfff","Type":"ContainerStarted","Data":"8cdc0c5c7ac1f96eaa410630a85d35517607d391dd7f590b858ab47e108efaf3"} Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.355610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c09fa57d-4e50-47b5-bd3b-8f260fdebfff","Type":"ContainerStarted","Data":"844b452033fbc47dade8e08e05fdac5370c10a09958c05dcaf8198bc936d7d41"} Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.358232 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerID="1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2" exitCode=0 Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.358300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mdmq" event={"ID":"bc0a0b6f-c304-4350-ad88-813e4637cad7","Type":"ContainerDied","Data":"1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2"} Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.358325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mdmq" event={"ID":"bc0a0b6f-c304-4350-ad88-813e4637cad7","Type":"ContainerStarted","Data":"6fbf98357bca8c43bf0c1a855b0826d317bacbb8fbd206e615a0aba66a1505f3"} Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.362484 4735 generic.go:334] "Generic (PLEG): container finished" podID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerID="66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8" exitCode=0 Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.362542 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfnd2" event={"ID":"6faf8566-a679-4c0f-866b-6f1b58c91769","Type":"ContainerDied","Data":"66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8"} Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.362578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfnd2" event={"ID":"6faf8566-a679-4c0f-866b-6f1b58c91769","Type":"ContainerStarted","Data":"4614eb40caf3cd529894337b50381e56a48bebbe447989488814f53fd2e2f748"} Dec 09 15:01:07 crc 
kubenswrapper[4735]: I1209 15:01:07.366193 4735 generic.go:334] "Generic (PLEG): container finished" podID="5576eea1-c9d0-4751-a19a-6f664fecc2b7" containerID="49671bc7955b71dc35a6f1ed2adb36136f958f6ec3a66eb76753b194af00dab0" exitCode=0 Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.366230 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5576eea1-c9d0-4751-a19a-6f664fecc2b7","Type":"ContainerDied","Data":"49671bc7955b71dc35a6f1ed2adb36136f958f6ec3a66eb76753b194af00dab0"} Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.368471 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.368462169 podStartE2EDuration="1.368462169s" podCreationTimestamp="2025-12-09 15:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:07.365645021 +0000 UTC m=+146.290483650" watchObservedRunningTime="2025-12-09 15:01:07.368462169 +0000 UTC m=+146.293300797" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.386604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.386677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.386762 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.386806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.388353 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.393882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.393923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.396955 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.398347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.402212 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-t5fvv" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.525205 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.532809 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 09 15:01:07 crc kubenswrapper[4735]: I1209 15:01:07.536701 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:01:08 crc kubenswrapper[4735]: I1209 15:01:08.345364 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:01:08 crc kubenswrapper[4735]: I1209 15:01:08.347288 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-n8ljm" Dec 09 15:01:08 crc kubenswrapper[4735]: I1209 15:01:08.399551 4735 generic.go:334] "Generic (PLEG): container finished" podID="c09fa57d-4e50-47b5-bd3b-8f260fdebfff" containerID="8cdc0c5c7ac1f96eaa410630a85d35517607d391dd7f590b858ab47e108efaf3" exitCode=0 Dec 09 15:01:08 crc kubenswrapper[4735]: I1209 15:01:08.400854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c09fa57d-4e50-47b5-bd3b-8f260fdebfff","Type":"ContainerDied","Data":"8cdc0c5c7ac1f96eaa410630a85d35517607d391dd7f590b858ab47e108efaf3"} Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.173907 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.178841 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338412 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kube-api-access\") pod \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338557 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kubelet-dir\") pod \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\" (UID: \"5576eea1-c9d0-4751-a19a-6f664fecc2b7\") " Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338593 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kubelet-dir\") pod \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338647 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5576eea1-c9d0-4751-a19a-6f664fecc2b7" (UID: "5576eea1-c9d0-4751-a19a-6f664fecc2b7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338670 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kube-api-access\") pod \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\" (UID: \"c09fa57d-4e50-47b5-bd3b-8f260fdebfff\") " Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338697 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c09fa57d-4e50-47b5-bd3b-8f260fdebfff" (UID: "c09fa57d-4e50-47b5-bd3b-8f260fdebfff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.338990 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.339010 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.342724 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c09fa57d-4e50-47b5-bd3b-8f260fdebfff" (UID: "c09fa57d-4e50-47b5-bd3b-8f260fdebfff"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.343476 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5576eea1-c9d0-4751-a19a-6f664fecc2b7" (UID: "5576eea1-c9d0-4751-a19a-6f664fecc2b7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.411523 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.412033 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5576eea1-c9d0-4751-a19a-6f664fecc2b7","Type":"ContainerDied","Data":"7d6a8cb0dc24004b6da33b65c61b2fcb8216f3d9d20c8a842d284d26399d40d9"} Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.412070 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6a8cb0dc24004b6da33b65c61b2fcb8216f3d9d20c8a842d284d26399d40d9" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.414998 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c09fa57d-4e50-47b5-bd3b-8f260fdebfff","Type":"ContainerDied","Data":"844b452033fbc47dade8e08e05fdac5370c10a09958c05dcaf8198bc936d7d41"} Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.415046 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844b452033fbc47dade8e08e05fdac5370c10a09958c05dcaf8198bc936d7d41" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.415059 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.440141 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c09fa57d-4e50-47b5-bd3b-8f260fdebfff-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:10 crc kubenswrapper[4735]: I1209 15:01:10.440164 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5576eea1-c9d0-4751-a19a-6f664fecc2b7-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:10 crc kubenswrapper[4735]: W1209 15:01:10.512802 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-b7397990f5fa9e6ea27676f2b43ab234e4f3eec258083079c4cbba253ca07d0c WatchSource:0}: Error finding container b7397990f5fa9e6ea27676f2b43ab234e4f3eec258083079c4cbba253ca07d0c: Status 404 returned error can't find the container with id b7397990f5fa9e6ea27676f2b43ab234e4f3eec258083079c4cbba253ca07d0c Dec 09 15:01:10 crc kubenswrapper[4735]: W1209 15:01:10.549425 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-cccae8d604d401cc2470793e0e8fc130e8fdf5da88a8a4250fb680c462f991a4 WatchSource:0}: Error finding container cccae8d604d401cc2470793e0e8fc130e8fdf5da88a8a4250fb680c462f991a4: Status 404 returned error can't find the container with id cccae8d604d401cc2470793e0e8fc130e8fdf5da88a8a4250fb680c462f991a4 Dec 09 15:01:10 crc kubenswrapper[4735]: W1209 15:01:10.570344 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e87aad15b1e9344736e4e7c77f40255e46cfa337a2159289fa9fa0622074150b WatchSource:0}: Error finding container e87aad15b1e9344736e4e7c77f40255e46cfa337a2159289fa9fa0622074150b: Status 404 returned error can't find the container with id e87aad15b1e9344736e4e7c77f40255e46cfa337a2159289fa9fa0622074150b Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.442051 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a3d4db1e1a9a70cb15bc2d4ddb3072b19503e0dcc55a33a1dca9a38ede9ef559"} Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.442348 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e87aad15b1e9344736e4e7c77f40255e46cfa337a2159289fa9fa0622074150b"} Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.442546 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.445962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1f073d9120b5e45111153cff41a0dcee6a976a88d225c7ab724f73310babbad8"} Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.445994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cccae8d604d401cc2470793e0e8fc130e8fdf5da88a8a4250fb680c462f991a4"} Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.449940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"395ec19d8c44ac07d5769d31d255c46a4f24c1df6bfdab7dc17a127ecc364834"} Dec 09 15:01:11 crc kubenswrapper[4735]: I1209 15:01:11.449985 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b7397990f5fa9e6ea27676f2b43ab234e4f3eec258083079c4cbba253ca07d0c"} Dec 09 15:01:12 crc kubenswrapper[4735]: I1209 15:01:12.480137 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8r6dv" Dec 09 15:01:16 crc kubenswrapper[4735]: I1209 15:01:16.903614 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:01:16 crc kubenswrapper[4735]: I1209 15:01:16.907026 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:01:20 crc kubenswrapper[4735]: I1209 15:01:20.700836 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:01:20 crc kubenswrapper[4735]: I1209 15:01:20.708445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2-metrics-certs\") pod \"network-metrics-daemon-jw8pr\" (UID: \"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2\") " pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:01:20 crc kubenswrapper[4735]: I1209 15:01:20.925742 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jw8pr" Dec 09 15:01:22 crc kubenswrapper[4735]: I1209 15:01:22.720358 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:01:26 crc kubenswrapper[4735]: I1209 15:01:26.903962 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jw8pr"] Dec 09 15:01:26 crc kubenswrapper[4735]: W1209 15:01:26.928635 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bbc2b0d_8dfc_4dcb_b857_5d441818b6c2.slice/crio-3437b63390990f2eefa5e93c9bcfa987e2da170a5b041798395839edb9ca60a6 WatchSource:0}: Error finding container 3437b63390990f2eefa5e93c9bcfa987e2da170a5b041798395839edb9ca60a6: Status 404 returned error can't find the container with id 3437b63390990f2eefa5e93c9bcfa987e2da170a5b041798395839edb9ca60a6 Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.535180 4735 generic.go:334] "Generic (PLEG): container finished" podID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerID="186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c" exitCode=0 Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.535241 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68nhq" event={"ID":"f273a98e-41b2-45ac-b140-8c73aaeeed54","Type":"ContainerDied","Data":"186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.538038 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerID="e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44" exitCode=0 Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.538140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78zr9" event={"ID":"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8","Type":"ContainerDied","Data":"e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.543546 4735 generic.go:334] "Generic (PLEG): container finished" podID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerID="37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc" exitCode=0 Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.543597 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578s6" event={"ID":"f032c5cf-0cbe-44c7-9523-1e19e33a31d4","Type":"ContainerDied","Data":"37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.546543 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerID="3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597" exitCode=0 Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.546618 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmc6l" event={"ID":"7ea62e8b-5dc1-4527-8279-2845d3666202","Type":"ContainerDied","Data":"3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.553767 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" 
event={"ID":"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2","Type":"ContainerStarted","Data":"7ae35fd5838df301c8d89921d0a95ce92623e3a5899cda8d2f0f754378c4a9cb"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.553809 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" event={"ID":"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2","Type":"ContainerStarted","Data":"d6512cbd3703f8fe2fdc89b7d3adcf6710b67840c9f95906b4c1e84deee35fdf"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.553820 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jw8pr" event={"ID":"6bbc2b0d-8dfc-4dcb-b857-5d441818b6c2","Type":"ContainerStarted","Data":"3437b63390990f2eefa5e93c9bcfa987e2da170a5b041798395839edb9ca60a6"} Dec 09 15:01:27 crc kubenswrapper[4735]: I1209 15:01:27.603128 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jw8pr" podStartSLOduration=149.6031143 podStartE2EDuration="2m29.6031143s" podCreationTimestamp="2025-12-09 14:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:27.601527312 +0000 UTC m=+166.526365940" watchObservedRunningTime="2025-12-09 15:01:27.6031143 +0000 UTC m=+166.527952928" Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.563181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmc6l" event={"ID":"7ea62e8b-5dc1-4527-8279-2845d3666202","Type":"ContainerStarted","Data":"283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe"} Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.566934 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerID="2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca" exitCode=0 Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.566974 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qzlx" event={"ID":"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914","Type":"ContainerDied","Data":"2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca"} Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.569382 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68nhq" event={"ID":"f273a98e-41b2-45ac-b140-8c73aaeeed54","Type":"ContainerStarted","Data":"f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5"} Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.574663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78zr9" event={"ID":"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8","Type":"ContainerStarted","Data":"a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0"} Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.578812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578s6" event={"ID":"f032c5cf-0cbe-44c7-9523-1e19e33a31d4","Type":"ContainerStarted","Data":"550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf"} Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.587903 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xmc6l" podStartSLOduration=2.7948342029999997 podStartE2EDuration="26.587894249s" podCreationTimestamp="2025-12-09 15:01:02 +0000 
UTC" firstStartedPulling="2025-12-09 15:01:04.3148121 +0000 UTC m=+143.239650728" lastFinishedPulling="2025-12-09 15:01:28.107872146 +0000 UTC m=+167.032710774" observedRunningTime="2025-12-09 15:01:28.582990309 +0000 UTC m=+167.507828936" watchObservedRunningTime="2025-12-09 15:01:28.587894249 +0000 UTC m=+167.512732877" Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.598262 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-68nhq" podStartSLOduration=2.815580407 podStartE2EDuration="24.598246999s" podCreationTimestamp="2025-12-09 15:01:04 +0000 UTC" firstStartedPulling="2025-12-09 15:01:06.343307056 +0000 UTC m=+145.268145684" lastFinishedPulling="2025-12-09 15:01:28.125973648 +0000 UTC m=+167.050812276" observedRunningTime="2025-12-09 15:01:28.597711628 +0000 UTC m=+167.522550256" watchObservedRunningTime="2025-12-09 15:01:28.598246999 +0000 UTC m=+167.523085627" Dec 09 15:01:28 crc kubenswrapper[4735]: I1209 15:01:28.611740 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-78zr9" podStartSLOduration=2.723807171 podStartE2EDuration="26.611725526s" podCreationTimestamp="2025-12-09 15:01:02 +0000 UTC" firstStartedPulling="2025-12-09 15:01:04.317066373 +0000 UTC m=+143.241905001" lastFinishedPulling="2025-12-09 15:01:28.204984728 +0000 UTC m=+167.129823356" observedRunningTime="2025-12-09 15:01:28.610366222 +0000 UTC m=+167.535204850" watchObservedRunningTime="2025-12-09 15:01:28.611725526 +0000 UTC m=+167.536564154" Dec 09 15:01:29 crc kubenswrapper[4735]: I1209 15:01:29.587397 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qzlx" event={"ID":"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914","Type":"ContainerStarted","Data":"669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6"} Dec 09 15:01:29 crc kubenswrapper[4735]: I1209 15:01:29.590889 4735 generic.go:334] "Generic (PLEG): container finished" podID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerID="07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a" exitCode=0 Dec 09 15:01:29 crc kubenswrapper[4735]: I1209 15:01:29.591786 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krqpw" event={"ID":"f0bcff89-5f52-43dc-ab10-6704a9143fbe","Type":"ContainerDied","Data":"07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a"} Dec 09 15:01:29 crc kubenswrapper[4735]: I1209 15:01:29.606463 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-578s6" podStartSLOduration=2.852385639 podStartE2EDuration="26.606445296s" podCreationTimestamp="2025-12-09 15:01:03 +0000 UTC" firstStartedPulling="2025-12-09 15:01:04.312119971 +0000 UTC m=+143.236958599" lastFinishedPulling="2025-12-09 15:01:28.066179627 +0000 UTC m=+166.991018256" observedRunningTime="2025-12-09 15:01:28.637553044 +0000 UTC m=+167.562391672" watchObservedRunningTime="2025-12-09 15:01:29.606445296 +0000 UTC m=+168.531283924" Dec 09 15:01:29 crc kubenswrapper[4735]: I1209 15:01:29.626739 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7qzlx" podStartSLOduration=1.8441875410000002 podStartE2EDuration="24.626725325s" podCreationTimestamp="2025-12-09 15:01:05 +0000 UTC" firstStartedPulling="2025-12-09 15:01:06.348861799 +0000 UTC m=+145.273700427" lastFinishedPulling="2025-12-09 
15:01:29.131399582 +0000 UTC m=+168.056238211" observedRunningTime="2025-12-09 15:01:29.607503896 +0000 UTC m=+168.532342524" watchObservedRunningTime="2025-12-09 15:01:29.626725325 +0000 UTC m=+168.551563954" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.135590 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.135942 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.222100 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.315195 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.315238 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.344732 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.527124 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.527205 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.558310 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.619761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krqpw" event={"ID":"f0bcff89-5f52-43dc-ab10-6704a9143fbe","Type":"ContainerStarted","Data":"e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690"} Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.627399 4735 generic.go:334] "Generic (PLEG): container finished" podID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerID="4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891" exitCode=0 Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.627464 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfnd2" event={"ID":"6faf8566-a679-4c0f-866b-6f1b58c91769","Type":"ContainerDied","Data":"4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891"} Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.629233 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerID="a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58" exitCode=0 Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.630155 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mdmq" event={"ID":"bc0a0b6f-c304-4350-ad88-813e4637cad7","Type":"ContainerDied","Data":"a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58"} Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.637232 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-krqpw" podStartSLOduration=2.189364462 podStartE2EDuration="30.637223417s" podCreationTimestamp="2025-12-09 15:01:03 +0000 UTC" firstStartedPulling="2025-12-09 15:01:04.313086696 +0000 UTC m=+143.237925324" lastFinishedPulling="2025-12-09 15:01:32.760945652 +0000 UTC m=+171.685784279" observedRunningTime="2025-12-09 15:01:33.637125531 +0000 UTC m=+172.561964159" watchObservedRunningTime="2025-12-09 15:01:33.637223417 +0000 UTC m=+172.562062046" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.667331 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.672756 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.673905 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.714349 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:33 crc kubenswrapper[4735]: I1209 15:01:33.714819 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.335969 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.336029 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.635289 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfnd2" event={"ID":"6faf8566-a679-4c0f-866b-6f1b58c91769","Type":"ContainerStarted","Data":"bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974"} Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.637020 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mdmq" event={"ID":"bc0a0b6f-c304-4350-ad88-813e4637cad7","Type":"ContainerStarted","Data":"c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413"} Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.649483 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tfnd2" podStartSLOduration=1.79703066 podStartE2EDuration="28.649469321s" podCreationTimestamp="2025-12-09 15:01:06 +0000 UTC" firstStartedPulling="2025-12-09 15:01:07.363426938 +0000 UTC m=+146.288265566" lastFinishedPulling="2025-12-09 15:01:34.215865599 +0000 UTC m=+173.140704227" observedRunningTime="2025-12-09 15:01:34.647780858 +0000 UTC m=+173.572619486" watchObservedRunningTime="2025-12-09 15:01:34.649469321 +0000 UTC m=+173.574307939" Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.660303 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4mdmq" podStartSLOduration=2.7213745129999998 podStartE2EDuration="29.660294594s" podCreationTimestamp="2025-12-09 15:01:05 +0000 UTC" firstStartedPulling="2025-12-09 15:01:07.35955111 +0000 UTC m=+146.284389738" lastFinishedPulling="2025-12-09 15:01:34.29847119 +0000 UTC m=+173.223309819" observedRunningTime="2025-12-09 15:01:34.660104952 +0000 UTC m=+173.584943600" watchObservedRunningTime="2025-12-09 15:01:34.660294594 +0000 UTC m=+173.585133221" Dec 09 15:01:34 crc kubenswrapper[4735]: I1209 15:01:34.740447 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-krqpw" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="registry-server" probeResult="failure" output=< Dec 09 15:01:34 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Dec 09 15:01:34 crc kubenswrapper[4735]: > Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.326199 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.330009 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.365488 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.668273 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.728752 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.728869 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.753087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.957973 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-578s6"] Dec 09 15:01:35 crc kubenswrapper[4735]: I1209 15:01:35.958170 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-578s6" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="registry-server" containerID="cri-o://550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf" gracePeriod=2 Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.237188 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.314302 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.314345 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.389194 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwkdw\" (UniqueName: \"kubernetes.io/projected/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-kube-api-access-zwkdw\") pod \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.389257 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-catalog-content\") pod \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.389278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-utilities\") pod \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\" (UID: \"f032c5cf-0cbe-44c7-9523-1e19e33a31d4\") " Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.390123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-utilities" (OuterVolumeSpecName: "utilities") pod "f032c5cf-0cbe-44c7-9523-1e19e33a31d4" (UID: "f032c5cf-0cbe-44c7-9523-1e19e33a31d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.395808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-kube-api-access-zwkdw" (OuterVolumeSpecName: "kube-api-access-zwkdw") pod "f032c5cf-0cbe-44c7-9523-1e19e33a31d4" (UID: "f032c5cf-0cbe-44c7-9523-1e19e33a31d4"). InnerVolumeSpecName "kube-api-access-zwkdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.430100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f032c5cf-0cbe-44c7-9523-1e19e33a31d4" (UID: "f032c5cf-0cbe-44c7-9523-1e19e33a31d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.491117 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwkdw\" (UniqueName: \"kubernetes.io/projected/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-kube-api-access-zwkdw\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.491143 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.491153 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f032c5cf-0cbe-44c7-9523-1e19e33a31d4-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.645912 4735 generic.go:334] "Generic (PLEG): container finished" podID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerID="550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf" exitCode=0 Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.646159 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578s6" event={"ID":"f032c5cf-0cbe-44c7-9523-1e19e33a31d4","Type":"ContainerDied","Data":"550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf"} Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.646214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-578s6" event={"ID":"f032c5cf-0cbe-44c7-9523-1e19e33a31d4","Type":"ContainerDied","Data":"9ded272b5cdf5d58e74b275c7fe286523e9811528c2594636c7584639018e0f8"} Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.646234 4735 scope.go:117] "RemoveContainer" containerID="550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.646369 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-578s6" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.661938 4735 scope.go:117] "RemoveContainer" containerID="37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.667195 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-578s6"] Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.670066 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-578s6"] Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.690154 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.701783 4735 scope.go:117] "RemoveContainer" containerID="663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.718505 4735 scope.go:117] "RemoveContainer" containerID="550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf" Dec 09 15:01:36 crc kubenswrapper[4735]: E1209 15:01:36.718804 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf\": container with ID starting with 550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf not found: ID does not exist" containerID="550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.718841 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf"} err="failed to get container status \"550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf\": rpc error: code = NotFound desc = could not find container \"550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf\": container with ID starting with 550e62581d20c38f4e29d9a43ed304f899c8cbae78b07daec7c607cbe05aafbf not found: ID does not exist" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.718878 4735 scope.go:117] "RemoveContainer" containerID="37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc" Dec 09 15:01:36 crc kubenswrapper[4735]: E1209 15:01:36.719118 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc\": container with ID starting with 37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc not found: ID does not exist" containerID="37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.719146 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc"} err="failed to get container status \"37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc\": rpc error: code = NotFound desc = could not find container \"37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc\": container with ID starting with 37fab2c89492ca89fb97bf423e8435ecb1e47ba859251ba03c852beb634f47fc not found: ID does not exist" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.719163 4735 scope.go:117] "RemoveContainer" 
containerID="663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42" Dec 09 15:01:36 crc kubenswrapper[4735]: E1209 15:01:36.719500 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42\": container with ID starting with 663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42 not found: ID does not exist" containerID="663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.719582 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42"} err="failed to get container status \"663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42\": rpc error: code = NotFound desc = could not find container \"663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42\": container with ID starting with 663edbab8a74ba6c1e77ee694d8b5b89c64263a29d05c3955d3f629d3c0dbf42 not found: ID does not exist" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.760661 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:36 crc kubenswrapper[4735]: I1209 15:01:36.760709 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:37 crc kubenswrapper[4735]: I1209 15:01:37.339832 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4mdmq" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="registry-server" probeResult="failure" output=< Dec 09 15:01:37 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Dec 09 15:01:37 crc kubenswrapper[4735]: > Dec 09 15:01:37 crc kubenswrapper[4735]: I1209 15:01:37.418423 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" path="/var/lib/kubelet/pods/f032c5cf-0cbe-44c7-9523-1e19e33a31d4/volumes" Dec 09 15:01:37 crc kubenswrapper[4735]: I1209 15:01:37.685673 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vsnk7" Dec 09 15:01:37 crc kubenswrapper[4735]: I1209 15:01:37.790109 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tfnd2" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="registry-server" probeResult="failure" output=< Dec 09 15:01:37 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Dec 09 15:01:37 crc kubenswrapper[4735]: > Dec 09 15:01:39 crc kubenswrapper[4735]: I1209 15:01:39.758293 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qzlx"] Dec 09 15:01:39 crc kubenswrapper[4735]: I1209 15:01:39.759388 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7qzlx" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="registry-server" containerID="cri-o://669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6" gracePeriod=2 Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.042034 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.234169 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-utilities\") pod \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.234284 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-catalog-content\") pod \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.234343 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjxqg\" (UniqueName: \"kubernetes.io/projected/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-kube-api-access-xjxqg\") pod \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\" (UID: \"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914\") " Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.234895 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-utilities" (OuterVolumeSpecName: "utilities") pod "ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" (UID: "ba7db0c8-c525-4eb8-8a40-4e0ad2d00914"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.237900 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-kube-api-access-xjxqg" (OuterVolumeSpecName: "kube-api-access-xjxqg") pod "ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" (UID: "ba7db0c8-c525-4eb8-8a40-4e0ad2d00914"). InnerVolumeSpecName "kube-api-access-xjxqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.248898 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" (UID: "ba7db0c8-c525-4eb8-8a40-4e0ad2d00914"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.335233 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.335259 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjxqg\" (UniqueName: \"kubernetes.io/projected/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-kube-api-access-xjxqg\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.335269 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.665297 4735 generic.go:334] "Generic (PLEG): container finished" podID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerID="669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6" exitCode=0 Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.665330 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qzlx" event={"ID":"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914","Type":"ContainerDied","Data":"669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6"} Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.665352 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7qzlx" event={"ID":"ba7db0c8-c525-4eb8-8a40-4e0ad2d00914","Type":"ContainerDied","Data":"73fa931fded0a317b390a5870bd367995699134a845b7a8de96e4735bca97a43"} Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.665334 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7qzlx" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.665369 4735 scope.go:117] "RemoveContainer" containerID="669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.676616 4735 scope.go:117] "RemoveContainer" containerID="2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.683799 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qzlx"] Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.685708 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7qzlx"] Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.690367 4735 scope.go:117] "RemoveContainer" containerID="b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.719061 4735 scope.go:117] "RemoveContainer" containerID="669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6" Dec 09 15:01:40 crc kubenswrapper[4735]: E1209 15:01:40.719344 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6\": container with ID starting with 669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6 not found: ID does not exist" containerID="669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.719374 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6"} err="failed to get container status \"669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6\": rpc error: code = NotFound desc = could not find container \"669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6\": container with ID starting with 669860378ef2e03f628cec0315b1e9c096605c1ed61e7778ea7c8c624cbeb1c6 not found: ID does not exist" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.719395 4735 scope.go:117] "RemoveContainer" containerID="2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca" Dec 09 15:01:40 crc kubenswrapper[4735]: E1209 15:01:40.719773 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca\": container with ID starting with 2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca not found: ID does not exist" containerID="2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.719816 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca"} err="failed to get container status \"2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca\": rpc error: code = NotFound desc = could not find container \"2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca\": container with ID starting with 2c419f279dcf4dea497303a653912e7b74e557edd51b158ccf011904a3fc3eca not found: ID does not exist" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.719837 4735 scope.go:117] "RemoveContainer" 
containerID="b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528" Dec 09 15:01:40 crc kubenswrapper[4735]: E1209 15:01:40.720071 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528\": container with ID starting with b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528 not found: ID does not exist" containerID="b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528" Dec 09 15:01:40 crc kubenswrapper[4735]: I1209 15:01:40.720092 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528"} err="failed to get container status \"b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528\": rpc error: code = NotFound desc = could not find container \"b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528\": container with ID starting with b0a3a145d9bd5a1626755b5a73b5d360acf2acc74c0086cd35e5b6e473488528 not found: ID does not exist" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378236 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378407 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="registry-server" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378417 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="registry-server" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378429 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="extract-content" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378434 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="extract-content" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378445 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="extract-utilities" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378451 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="extract-utilities" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378460 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="extract-utilities" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378465 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="extract-utilities" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378473 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5576eea1-c9d0-4751-a19a-6f664fecc2b7" containerName="pruner" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378477 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5576eea1-c9d0-4751-a19a-6f664fecc2b7" containerName="pruner" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378485 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="extract-content" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378490 4735 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="extract-content" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378501 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09fa57d-4e50-47b5-bd3b-8f260fdebfff" containerName="pruner" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378506 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09fa57d-4e50-47b5-bd3b-8f260fdebfff" containerName="pruner" Dec 09 15:01:41 crc kubenswrapper[4735]: E1209 15:01:41.378529 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="registry-server" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378535 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="registry-server" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378608 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" containerName="registry-server" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378617 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f032c5cf-0cbe-44c7-9523-1e19e33a31d4" containerName="registry-server" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378627 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5576eea1-c9d0-4751-a19a-6f664fecc2b7" containerName="pruner" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378634 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09fa57d-4e50-47b5-bd3b-8f260fdebfff" containerName="pruner" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.378930 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.382851 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.389327 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.402328 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.418766 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7db0c8-c525-4eb8-8a40-4e0ad2d00914" path="/var/lib/kubelet/pods/ba7db0c8-c525-4eb8-8a40-4e0ad2d00914/volumes" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.545843 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed2b754b-debe-4858-b7b2-c92683dcb499-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.546072 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed2b754b-debe-4858-b7b2-c92683dcb499-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.646931 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed2b754b-debe-4858-b7b2-c92683dcb499-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.646982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed2b754b-debe-4858-b7b2-c92683dcb499-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.647074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed2b754b-debe-4858-b7b2-c92683dcb499-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.659667 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed2b754b-debe-4858-b7b2-c92683dcb499-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:41 crc kubenswrapper[4735]: I1209 15:01:41.707781 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:42 crc kubenswrapper[4735]: I1209 15:01:42.038926 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 09 15:01:42 crc kubenswrapper[4735]: I1209 15:01:42.672875 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed2b754b-debe-4858-b7b2-c92683dcb499","Type":"ContainerStarted","Data":"01711cc8ac70611eb7e0d0cdb1149f57af2672bbda8a443d6285a1fe9838d1e0"} Dec 09 15:01:42 crc kubenswrapper[4735]: I1209 15:01:42.673111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed2b754b-debe-4858-b7b2-c92683dcb499","Type":"ContainerStarted","Data":"e5dca1a3cdb551ebe1ae451630c42f33742ba515bfdf3df04cca07fd78e73a74"} Dec 09 15:01:42 crc kubenswrapper[4735]: I1209 15:01:42.682932 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.682917805 podStartE2EDuration="1.682917805s" podCreationTimestamp="2025-12-09 15:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:42.681406981 +0000 UTC m=+181.606245609" watchObservedRunningTime="2025-12-09 15:01:42.682917805 +0000 UTC m=+181.607756432" Dec 09 15:01:43 crc kubenswrapper[4735]: I1209 15:01:43.677217 4735 generic.go:334] "Generic (PLEG): container finished" podID="ed2b754b-debe-4858-b7b2-c92683dcb499" containerID="01711cc8ac70611eb7e0d0cdb1149f57af2672bbda8a443d6285a1fe9838d1e0" exitCode=0 Dec 09 15:01:43 crc kubenswrapper[4735]: I1209 15:01:43.677316 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed2b754b-debe-4858-b7b2-c92683dcb499","Type":"ContainerDied","Data":"01711cc8ac70611eb7e0d0cdb1149f57af2672bbda8a443d6285a1fe9838d1e0"} Dec 09 15:01:43 crc kubenswrapper[4735]: I1209 15:01:43.743065 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:43 crc kubenswrapper[4735]: I1209 15:01:43.771167 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:44 crc kubenswrapper[4735]: I1209 15:01:44.846091 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:44 crc kubenswrapper[4735]: I1209 15:01:44.982091 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed2b754b-debe-4858-b7b2-c92683dcb499-kube-api-access\") pod \"ed2b754b-debe-4858-b7b2-c92683dcb499\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " Dec 09 15:01:44 crc kubenswrapper[4735]: I1209 15:01:44.982128 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed2b754b-debe-4858-b7b2-c92683dcb499-kubelet-dir\") pod \"ed2b754b-debe-4858-b7b2-c92683dcb499\" (UID: \"ed2b754b-debe-4858-b7b2-c92683dcb499\") " Dec 09 15:01:44 crc kubenswrapper[4735]: I1209 15:01:44.982280 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed2b754b-debe-4858-b7b2-c92683dcb499-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ed2b754b-debe-4858-b7b2-c92683dcb499" (UID: "ed2b754b-debe-4858-b7b2-c92683dcb499"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:01:44 crc kubenswrapper[4735]: I1209 15:01:44.986034 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2b754b-debe-4858-b7b2-c92683dcb499-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ed2b754b-debe-4858-b7b2-c92683dcb499" (UID: "ed2b754b-debe-4858-b7b2-c92683dcb499"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.083554 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed2b754b-debe-4858-b7b2-c92683dcb499-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.083582 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed2b754b-debe-4858-b7b2-c92683dcb499-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.158557 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krqpw"] Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.515085 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxscb"] Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.685923 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krqpw" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="registry-server" containerID="cri-o://e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690" gracePeriod=2 Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.686182 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.686583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed2b754b-debe-4858-b7b2-c92683dcb499","Type":"ContainerDied","Data":"e5dca1a3cdb551ebe1ae451630c42f33742ba515bfdf3df04cca07fd78e73a74"} Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.686603 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5dca1a3cdb551ebe1ae451630c42f33742ba515bfdf3df04cca07fd78e73a74" Dec 09 15:01:45 crc kubenswrapper[4735]: I1209 15:01:45.954399 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.092736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4kzj\" (UniqueName: \"kubernetes.io/projected/f0bcff89-5f52-43dc-ab10-6704a9143fbe-kube-api-access-v4kzj\") pod \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.092908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-utilities\") pod \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.092949 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-catalog-content\") pod \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\" (UID: \"f0bcff89-5f52-43dc-ab10-6704a9143fbe\") " Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.093558 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-utilities" (OuterVolumeSpecName: "utilities") pod "f0bcff89-5f52-43dc-ab10-6704a9143fbe" (UID: "f0bcff89-5f52-43dc-ab10-6704a9143fbe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.097238 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0bcff89-5f52-43dc-ab10-6704a9143fbe-kube-api-access-v4kzj" (OuterVolumeSpecName: "kube-api-access-v4kzj") pod "f0bcff89-5f52-43dc-ab10-6704a9143fbe" (UID: "f0bcff89-5f52-43dc-ab10-6704a9143fbe"). InnerVolumeSpecName "kube-api-access-v4kzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.130993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0bcff89-5f52-43dc-ab10-6704a9143fbe" (UID: "f0bcff89-5f52-43dc-ab10-6704a9143fbe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.193726 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.193757 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0bcff89-5f52-43dc-ab10-6704a9143fbe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.193769 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4kzj\" (UniqueName: \"kubernetes.io/projected/f0bcff89-5f52-43dc-ab10-6704a9143fbe-kube-api-access-v4kzj\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.340885 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.367117 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.691995 4735 generic.go:334] "Generic (PLEG): container finished" podID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerID="e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690" exitCode=0 Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.692038 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krqpw" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.692077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krqpw" event={"ID":"f0bcff89-5f52-43dc-ab10-6704a9143fbe","Type":"ContainerDied","Data":"e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690"} Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.692115 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krqpw" event={"ID":"f0bcff89-5f52-43dc-ab10-6704a9143fbe","Type":"ContainerDied","Data":"7ba9323c5e5364a2a48f5945c76cd48d2d29bdbe68fdf2318a293a6b8c41540d"} Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.692132 4735 scope.go:117] "RemoveContainer" containerID="e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.712021 4735 scope.go:117] "RemoveContainer" containerID="07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.717595 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krqpw"] Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.719936 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krqpw"] Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.730941 4735 scope.go:117] "RemoveContainer" containerID="9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.742413 4735 scope.go:117] "RemoveContainer" containerID="e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690" Dec 09 15:01:46 crc kubenswrapper[4735]: E1209 15:01:46.743740 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690\": container with ID starting with e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690 not found: ID does not exist" containerID="e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.743773 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690"} err="failed to get container status \"e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690\": rpc error: code = NotFound desc = could not find container \"e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690\": container with ID starting with e5e463e55074cea40b9d9697a1b3dca5f6416a0e847cc798752d471113910690 not found: ID does not exist" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.743793 4735 scope.go:117] "RemoveContainer" containerID="07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a" Dec 09 15:01:46 crc kubenswrapper[4735]: E1209 15:01:46.744039 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a\": container with ID starting with 07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a not found: ID does not exist" containerID="07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.744068 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a"} err="failed to get container status \"07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a\": rpc error: code = NotFound desc = could not find container \"07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a\": container with ID starting with 07bc4c59c960c4c2cb4e868bbf484d032dfb525ff71f5cfb0191415f6bea3c3a not found: ID does not exist" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.744089 4735 scope.go:117] "RemoveContainer" containerID="9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1" Dec 09 15:01:46 crc kubenswrapper[4735]: E1209 15:01:46.744310 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1\": container with ID starting with 9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1 not found: ID does not exist" containerID="9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.744342 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1"} err="failed to get container status \"9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1\": rpc error: code = NotFound desc = could not find container \"9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1\": container with ID starting with 9252b8064543086ef061edeefbb362d1c96e985e1df2a399cd75c482a83923a1 not found: ID does not exist" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.792261 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:46 crc kubenswrapper[4735]: I1209 15:01:46.820689 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:47 crc kubenswrapper[4735]: I1209 15:01:47.419557 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" path="/var/lib/kubelet/pods/f0bcff89-5f52-43dc-ab10-6704a9143fbe/volumes" Dec 09 15:01:47 crc kubenswrapper[4735]: I1209 15:01:47.541448 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.179786 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 15:01:48 crc kubenswrapper[4735]: E1209 15:01:48.180394 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2b754b-debe-4858-b7b2-c92683dcb499" containerName="pruner" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.180411 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b754b-debe-4858-b7b2-c92683dcb499" containerName="pruner" Dec 09 15:01:48 crc kubenswrapper[4735]: E1209 15:01:48.180423 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="registry-server" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.180429 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="registry-server" Dec 09 15:01:48 crc kubenswrapper[4735]: E1209 15:01:48.180440 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="extract-content" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.180446 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="extract-content" Dec 09 15:01:48 crc kubenswrapper[4735]: E1209 15:01:48.180457 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="extract-utilities" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.180463 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="extract-utilities" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.180613 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0bcff89-5f52-43dc-ab10-6704a9143fbe" containerName="registry-server" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.181282 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2b754b-debe-4858-b7b2-c92683dcb499" containerName="pruner" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.182386 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.184415 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.184778 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.186544 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.213366 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/508b0d5f-7739-45b0-be00-16110455e3e3-kube-api-access\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.213417 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-var-lock\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.213535 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.314876 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.314951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/508b0d5f-7739-45b0-be00-16110455e3e3-kube-api-access\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.315003 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-var-lock\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.315016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.315181 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-var-lock\") pod \"installer-9-crc\" (UID: 
\"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.334470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/508b0d5f-7739-45b0-be00-16110455e3e3-kube-api-access\") pod \"installer-9-crc\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.497919 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.759305 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfnd2"] Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.759662 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tfnd2" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="registry-server" containerID="cri-o://bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974" gracePeriod=2 Dec 09 15:01:48 crc kubenswrapper[4735]: I1209 15:01:48.857283 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 09 15:01:48 crc kubenswrapper[4735]: W1209 15:01:48.872211 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod508b0d5f_7739_45b0_be00_16110455e3e3.slice/crio-beda907434d5ed6bdecf5562621f0652898131cf308da2a5fae8a5d00d05923a WatchSource:0}: Error finding container beda907434d5ed6bdecf5562621f0652898131cf308da2a5fae8a5d00d05923a: Status 404 returned error can't find the container with id beda907434d5ed6bdecf5562621f0652898131cf308da2a5fae8a5d00d05923a Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.002338 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.022156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-catalog-content\") pod \"6faf8566-a679-4c0f-866b-6f1b58c91769\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.022197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr6lq\" (UniqueName: \"kubernetes.io/projected/6faf8566-a679-4c0f-866b-6f1b58c91769-kube-api-access-lr6lq\") pod \"6faf8566-a679-4c0f-866b-6f1b58c91769\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.022222 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-utilities\") pod \"6faf8566-a679-4c0f-866b-6f1b58c91769\" (UID: \"6faf8566-a679-4c0f-866b-6f1b58c91769\") " Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.023235 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-utilities" (OuterVolumeSpecName: "utilities") pod "6faf8566-a679-4c0f-866b-6f1b58c91769" (UID: "6faf8566-a679-4c0f-866b-6f1b58c91769"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.026929 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6faf8566-a679-4c0f-866b-6f1b58c91769-kube-api-access-lr6lq" (OuterVolumeSpecName: "kube-api-access-lr6lq") pod "6faf8566-a679-4c0f-866b-6f1b58c91769" (UID: "6faf8566-a679-4c0f-866b-6f1b58c91769"). InnerVolumeSpecName "kube-api-access-lr6lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.105168 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6faf8566-a679-4c0f-866b-6f1b58c91769" (UID: "6faf8566-a679-4c0f-866b-6f1b58c91769"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.123768 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.123797 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr6lq\" (UniqueName: \"kubernetes.io/projected/6faf8566-a679-4c0f-866b-6f1b58c91769-kube-api-access-lr6lq\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.123811 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6faf8566-a679-4c0f-866b-6f1b58c91769-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.705785 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"508b0d5f-7739-45b0-be00-16110455e3e3","Type":"ContainerStarted","Data":"f85d18cef3085dad9d1340e772b023ea68845f1b2a1ec68e24ea595cd05d07c8"} Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.705847 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"508b0d5f-7739-45b0-be00-16110455e3e3","Type":"ContainerStarted","Data":"beda907434d5ed6bdecf5562621f0652898131cf308da2a5fae8a5d00d05923a"} Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.708266 4735 generic.go:334] "Generic (PLEG): container finished" podID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerID="bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974" exitCode=0 Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.708300 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfnd2" event={"ID":"6faf8566-a679-4c0f-866b-6f1b58c91769","Type":"ContainerDied","Data":"bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974"} Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.708300 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tfnd2" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.708321 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tfnd2" event={"ID":"6faf8566-a679-4c0f-866b-6f1b58c91769","Type":"ContainerDied","Data":"4614eb40caf3cd529894337b50381e56a48bebbe447989488814f53fd2e2f748"} Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.708366 4735 scope.go:117] "RemoveContainer" containerID="bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.718029 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.718018805 podStartE2EDuration="1.718018805s" podCreationTimestamp="2025-12-09 15:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:01:49.716567926 +0000 UTC m=+188.641406554" watchObservedRunningTime="2025-12-09 15:01:49.718018805 +0000 UTC m=+188.642857433" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.720653 4735 scope.go:117] "RemoveContainer" containerID="4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.724708 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tfnd2"] Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.729055 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tfnd2"] Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.732413 4735 scope.go:117] "RemoveContainer" containerID="66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.743433 4735 scope.go:117] "RemoveContainer" containerID="bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974" Dec 09 15:01:49 crc kubenswrapper[4735]: E1209 15:01:49.743710 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974\": container with ID starting with bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974 not found: ID does not exist" containerID="bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.743740 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974"} err="failed to get container status \"bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974\": rpc error: code = NotFound desc = could not find container \"bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974\": container with ID starting with bac7d30b275ad84ac2774f7d324095fe55316b41b67ad5e9ab63f9c8e9477974 not found: ID does not exist" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.743761 4735 scope.go:117] "RemoveContainer" containerID="4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891" Dec 09 15:01:49 crc kubenswrapper[4735]: E1209 15:01:49.744057 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891\": container with ID starting with 
4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891 not found: ID does not exist" containerID="4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.744147 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891"} err="failed to get container status \"4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891\": rpc error: code = NotFound desc = could not find container \"4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891\": container with ID starting with 4ac834a91a96ee4b8df3a5756c993a2e182f1eea11d8d84cca0f27d65b0ab891 not found: ID does not exist" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.744220 4735 scope.go:117] "RemoveContainer" containerID="66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8" Dec 09 15:01:49 crc kubenswrapper[4735]: E1209 15:01:49.744673 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8\": container with ID starting with 66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8 not found: ID does not exist" containerID="66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8" Dec 09 15:01:49 crc kubenswrapper[4735]: I1209 15:01:49.744703 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8"} err="failed to get container status \"66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8\": rpc error: code = NotFound desc = could not find container \"66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8\": container with ID starting with 66714bd716eabed09655312e7538e60d8c24b82d4a96aef5c89906a05b5cebf8 not found: ID does not exist" Dec 09 15:01:51 crc kubenswrapper[4735]: I1209 15:01:51.418723 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" path="/var/lib/kubelet/pods/6faf8566-a679-4c0f-866b-6f1b58c91769/volumes" Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.336309 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.337581 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.337643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.338132 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.338208 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753" gracePeriod=600 Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.785636 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753" exitCode=0 Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.785711 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753"} Dec 09 15:02:04 crc kubenswrapper[4735]: I1209 15:02:04.786019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"04b7c863f3e25aee025e034071815199469e37d7bb0cc98f93e577fecf50982a"} Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.534466 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" podUID="10df02c0-bbd4-4021-acf6-311c2186ff9e" containerName="oauth-openshift" containerID="cri-o://ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853" gracePeriod=15 Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.804079 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.820013 4735 generic.go:334] "Generic (PLEG): container finished" podID="10df02c0-bbd4-4021-acf6-311c2186ff9e" containerID="ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853" exitCode=0 Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.820043 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.820683 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" event={"ID":"10df02c0-bbd4-4021-acf6-311c2186ff9e","Type":"ContainerDied","Data":"ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853"} Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.820753 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxscb" event={"ID":"10df02c0-bbd4-4021-acf6-311c2186ff9e","Type":"ContainerDied","Data":"af1539b531f871d2326ee3e01fffa5bbae2807aee438404a33d7da3d5f703da9"} Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.820774 4735 scope.go:117] "RemoveContainer" containerID="ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822521 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-85455bb588-rhcdj"] Dec 09 15:02:10 crc kubenswrapper[4735]: E1209 15:02:10.822663 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="registry-server" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822674 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="registry-server" Dec 09 15:02:10 crc kubenswrapper[4735]: E1209 15:02:10.822685 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="extract-content" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822690 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="extract-content" Dec 09 15:02:10 crc kubenswrapper[4735]: E1209 15:02:10.822703 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10df02c0-bbd4-4021-acf6-311c2186ff9e" containerName="oauth-openshift" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822708 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="10df02c0-bbd4-4021-acf6-311c2186ff9e" containerName="oauth-openshift" Dec 09 15:02:10 crc kubenswrapper[4735]: E1209 15:02:10.822721 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="extract-utilities" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822727 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="extract-utilities" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822810 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="10df02c0-bbd4-4021-acf6-311c2186ff9e" containerName="oauth-openshift" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.822819 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6faf8566-a679-4c0f-866b-6f1b58c91769" containerName="registry-server" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.823106 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.832614 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85455bb588-rhcdj"] Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.846720 4735 scope.go:117] "RemoveContainer" containerID="ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853" Dec 09 15:02:10 crc kubenswrapper[4735]: E1209 15:02:10.847205 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853\": container with ID starting with ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853 not found: ID does not exist" containerID="ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.847268 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853"} err="failed to get container status \"ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853\": rpc error: code = NotFound desc = could not find container \"ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853\": container with ID starting with ad1afa44ddde327d4e753455dda6d9f60648b6769675f154b2ee94094e55d853 not found: ID does not exist" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-serving-cert\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-cliconfig\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919343 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-error\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919632 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsw9l\" (UniqueName: \"kubernetes.io/projected/10df02c0-bbd4-4021-acf6-311c2186ff9e-kube-api-access-fsw9l\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-provider-selection\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919717 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-idp-0-file-data\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919774 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-router-certs\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919820 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-login\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919849 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-trusted-ca-bundle\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919881 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-service-ca\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919898 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-policies\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919918 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-session\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919942 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-dir\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.919977 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-ocp-branding-template\") pod \"10df02c0-bbd4-4021-acf6-311c2186ff9e\" (UID: \"10df02c0-bbd4-4021-acf6-311c2186ff9e\") " Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920019 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-cliconfig" 
(OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920216 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920266 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-login\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920307 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920332 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-audit-policies\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920353 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-session\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920371 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920407 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920427 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-error\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920470 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920495 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920538 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-audit-dir\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920572 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6t99\" (UniqueName: \"kubernetes.io/projected/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-kube-api-access-s6t99\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920622 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920665 4735 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920674 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.920799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.921324 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.924305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10df02c0-bbd4-4021-acf6-311c2186ff9e-kube-api-access-fsw9l" (OuterVolumeSpecName: "kube-api-access-fsw9l") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "kube-api-access-fsw9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.924419 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.924683 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.924904 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.925052 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.925137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.925255 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.925392 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:10 crc kubenswrapper[4735]: I1209 15:02:10.925477 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "10df02c0-bbd4-4021-acf6-311c2186ff9e" (UID: "10df02c0-bbd4-4021-acf6-311c2186ff9e"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.020884 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6t99\" (UniqueName: \"kubernetes.io/projected/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-kube-api-access-s6t99\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.020935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.020952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.020967 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-login\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.020986 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021007 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-audit-policies\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021022 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-session\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021038 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " 
pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021059 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-error\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021115 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021140 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-audit-dir\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021160 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021191 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021203 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021212 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021222 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021231 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021239 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021246 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10df02c0-bbd4-4021-acf6-311c2186ff9e-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021254 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021262 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021270 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021279 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsw9l\" (UniqueName: \"kubernetes.io/projected/10df02c0-bbd4-4021-acf6-311c2186ff9e-kube-api-access-fsw9l\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021288 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021296 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10df02c0-bbd4-4021-acf6-311c2186ff9e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.021801 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-audit-dir\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.022405 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.022405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.022496 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-audit-policies\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.022934 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-service-ca\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.023633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.023703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-error\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.024432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.024604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-session\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.024822 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.025025 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-user-template-login\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.025283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-router-certs\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.025399 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.035068 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6t99\" (UniqueName: \"kubernetes.io/projected/8a1d38b2-2dcc-4739-a195-66a68a9c6a7d-kube-api-access-s6t99\") pod \"oauth-openshift-85455bb588-rhcdj\" (UID: \"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d\") " pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.135080 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.143579 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxscb"] Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.144769 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxscb"] Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.418348 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10df02c0-bbd4-4021-acf6-311c2186ff9e" path="/var/lib/kubelet/pods/10df02c0-bbd4-4021-acf6-311c2186ff9e/volumes" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.459905 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85455bb588-rhcdj"] Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.825021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" event={"ID":"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d","Type":"ContainerStarted","Data":"3b824510a294f6ad4601ab82cde8d306c4ac018d624d337327aee2027ce067c1"} Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.825197 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" event={"ID":"8a1d38b2-2dcc-4739-a195-66a68a9c6a7d","Type":"ContainerStarted","Data":"3ae101c3d0bd3bd5a82a44375141f0dc1b52c088bc7745135684579e0e45fe3d"} Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.825909 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.941145 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" Dec 09 15:02:11 crc kubenswrapper[4735]: I1209 15:02:11.955340 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-85455bb588-rhcdj" podStartSLOduration=26.955327369 podStartE2EDuration="26.955327369s" podCreationTimestamp="2025-12-09 15:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:02:11.839971244 +0000 UTC m=+210.764809873" watchObservedRunningTime="2025-12-09 15:02:11.955327369 +0000 UTC m=+210.880165997" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.187508 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmc6l"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.188036 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xmc6l" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="registry-server" containerID="cri-o://283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe" gracePeriod=30 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.192400 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78zr9"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.192609 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-78zr9" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="registry-server" 
containerID="cri-o://a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0" gracePeriod=30 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.199333 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxhld"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.199450 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerName="marketplace-operator" containerID="cri-o://79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9" gracePeriod=30 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.205804 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-68nhq"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.205960 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-68nhq" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="registry-server" containerID="cri-o://f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5" gracePeriod=30 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.212142 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4mdmq"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.212286 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4mdmq" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="registry-server" containerID="cri-o://c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" gracePeriod=30 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.214696 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d4nc"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.215244 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.223972 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d4nc"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.278921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156f78f9-e75d-4fe3-92b0-e2c29af0728c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.278994 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqdjs\" (UniqueName: \"kubernetes.io/projected/156f78f9-e75d-4fe3-92b0-e2c29af0728c-kube-api-access-pqdjs\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.279094 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156f78f9-e75d-4fe3-92b0-e2c29af0728c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.314138 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413 is running failed: container process not found" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.318816 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413 is running failed: container process not found" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.319260 4735 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413 is running failed: container process not found" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" cmd=["grpc_health_probe","-addr=:50051"] Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.319299 4735 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-4mdmq" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="registry-server" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.379986 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156f78f9-e75d-4fe3-92b0-e2c29af0728c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.380043 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156f78f9-e75d-4fe3-92b0-e2c29af0728c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.380097 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqdjs\" (UniqueName: \"kubernetes.io/projected/156f78f9-e75d-4fe3-92b0-e2c29af0728c-kube-api-access-pqdjs\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.381089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156f78f9-e75d-4fe3-92b0-e2c29af0728c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.386143 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156f78f9-e75d-4fe3-92b0-e2c29af0728c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.393538 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqdjs\" (UniqueName: \"kubernetes.io/projected/156f78f9-e75d-4fe3-92b0-e2c29af0728c-kube-api-access-pqdjs\") pod \"marketplace-operator-79b997595-9d4nc\" (UID: \"156f78f9-e75d-4fe3-92b0-e2c29af0728c\") " pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.526826 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.534330 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.534887 4735 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.534952 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.535119 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51" gracePeriod=15 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.535147 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286" gracePeriod=15 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.535375 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f" gracePeriod=15 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.535440 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e" gracePeriod=15 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.535148 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65" gracePeriod=15 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.544425 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.544785 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.544803 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.545718 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545737 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.545756 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545763 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.545774 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545780 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.545788 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545796 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.545813 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545818 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545973 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545983 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.545990 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.546002 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.546010 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.546019 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.546721 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.546733 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.581858 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.581908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.581925 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.581945 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.581960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.582000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.582021 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.582034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.613750 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.614430 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.614559 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.614636 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.614867 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.615193 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.615225 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.615484 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.615723 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.615987 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.617819 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.617993 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.631587 4735 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.631897 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.632188 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.632365 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.632525 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.632681 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.632922 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.633158 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.633312 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.633499 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.633737 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.633931 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.634123 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.641334 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.226:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682531 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6bts\" (UniqueName: \"kubernetes.io/projected/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-kube-api-access-t6bts\") pod \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682639 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-operator-metrics\") pod \"fc05d12f-eaa9-48ae-b280-e449caed078c\" (UID: 
\"fc05d12f-eaa9-48ae-b280-e449caed078c\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682713 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b85cr\" (UniqueName: \"kubernetes.io/projected/fc05d12f-eaa9-48ae-b280-e449caed078c-kube-api-access-b85cr\") pod \"fc05d12f-eaa9-48ae-b280-e449caed078c\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682734 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-catalog-content\") pod \"bc0a0b6f-c304-4350-ad88-813e4637cad7\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682848 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-utilities\") pod \"f273a98e-41b2-45ac-b140-8c73aaeeed54\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-utilities\") pod \"7ea62e8b-5dc1-4527-8279-2845d3666202\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682926 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvgj\" (UniqueName: \"kubernetes.io/projected/f273a98e-41b2-45ac-b140-8c73aaeeed54-kube-api-access-vqvgj\") pod \"f273a98e-41b2-45ac-b140-8c73aaeeed54\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.682992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-catalog-content\") pod \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683067 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxj5p\" (UniqueName: \"kubernetes.io/projected/bc0a0b6f-c304-4350-ad88-813e4637cad7-kube-api-access-sxj5p\") pod \"bc0a0b6f-c304-4350-ad88-813e4637cad7\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-trusted-ca\") pod \"fc05d12f-eaa9-48ae-b280-e449caed078c\" (UID: \"fc05d12f-eaa9-48ae-b280-e449caed078c\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683138 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-catalog-content\") pod \"7ea62e8b-5dc1-4527-8279-2845d3666202\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683152 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-catalog-content\") pod 
\"f273a98e-41b2-45ac-b140-8c73aaeeed54\" (UID: \"f273a98e-41b2-45ac-b140-8c73aaeeed54\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-utilities\") pod \"bc0a0b6f-c304-4350-ad88-813e4637cad7\" (UID: \"bc0a0b6f-c304-4350-ad88-813e4637cad7\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683232 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-utilities\") pod \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\" (UID: \"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcwbv\" (UniqueName: \"kubernetes.io/projected/7ea62e8b-5dc1-4527-8279-2845d3666202-kube-api-access-xcwbv\") pod \"7ea62e8b-5dc1-4527-8279-2845d3666202\" (UID: \"7ea62e8b-5dc1-4527-8279-2845d3666202\") " Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683548 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683589 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683815 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.683909 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.684092 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.685976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.687229 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-utilities" (OuterVolumeSpecName: "utilities") pod "bc0a0b6f-c304-4350-ad88-813e4637cad7" (UID: "bc0a0b6f-c304-4350-ad88-813e4637cad7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688094 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-utilities" (OuterVolumeSpecName: "utilities") pod "7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" (UID: "7ef71104-8a24-4bf0-9be4-f107d3b8f6e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688258 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "fc05d12f-eaa9-48ae-b280-e449caed078c" (UID: "fc05d12f-eaa9-48ae-b280-e449caed078c"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688376 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688415 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688459 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688483 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.688654 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.689079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "fc05d12f-eaa9-48ae-b280-e449caed078c" (UID: "fc05d12f-eaa9-48ae-b280-e449caed078c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.689105 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-utilities" (OuterVolumeSpecName: "utilities") pod "7ea62e8b-5dc1-4527-8279-2845d3666202" (UID: "7ea62e8b-5dc1-4527-8279-2845d3666202"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.689410 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.691397 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0a0b6f-c304-4350-ad88-813e4637cad7-kube-api-access-sxj5p" (OuterVolumeSpecName: "kube-api-access-sxj5p") pod "bc0a0b6f-c304-4350-ad88-813e4637cad7" (UID: "bc0a0b6f-c304-4350-ad88-813e4637cad7"). 
InnerVolumeSpecName "kube-api-access-sxj5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.691653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea62e8b-5dc1-4527-8279-2845d3666202-kube-api-access-xcwbv" (OuterVolumeSpecName: "kube-api-access-xcwbv") pod "7ea62e8b-5dc1-4527-8279-2845d3666202" (UID: "7ea62e8b-5dc1-4527-8279-2845d3666202"). InnerVolumeSpecName "kube-api-access-xcwbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.691930 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-utilities" (OuterVolumeSpecName: "utilities") pod "f273a98e-41b2-45ac-b140-8c73aaeeed54" (UID: "f273a98e-41b2-45ac-b140-8c73aaeeed54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.692522 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc05d12f-eaa9-48ae-b280-e449caed078c-kube-api-access-b85cr" (OuterVolumeSpecName: "kube-api-access-b85cr") pod "fc05d12f-eaa9-48ae-b280-e449caed078c" (UID: "fc05d12f-eaa9-48ae-b280-e449caed078c"). InnerVolumeSpecName "kube-api-access-b85cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.694163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f273a98e-41b2-45ac-b140-8c73aaeeed54-kube-api-access-vqvgj" (OuterVolumeSpecName: "kube-api-access-vqvgj") pod "f273a98e-41b2-45ac-b140-8c73aaeeed54" (UID: "f273a98e-41b2-45ac-b140-8c73aaeeed54"). InnerVolumeSpecName "kube-api-access-vqvgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.694487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-kube-api-access-t6bts" (OuterVolumeSpecName: "kube-api-access-t6bts") pod "7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" (UID: "7ef71104-8a24-4bf0-9be4-f107d3b8f6e8"). InnerVolumeSpecName "kube-api-access-t6bts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.709710 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f273a98e-41b2-45ac-b140-8c73aaeeed54" (UID: "f273a98e-41b2-45ac-b140-8c73aaeeed54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.742299 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ea62e8b-5dc1-4527-8279-2845d3666202" (UID: "7ea62e8b-5dc1-4527-8279-2845d3666202"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.742710 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" (UID: "7ef71104-8a24-4bf0-9be4-f107d3b8f6e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.779500 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc0a0b6f-c304-4350-ad88-813e4637cad7" (UID: "bc0a0b6f-c304-4350-ad88-813e4637cad7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785018 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785042 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785061 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785070 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785079 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785086 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcwbv\" (UniqueName: \"kubernetes.io/projected/7ea62e8b-5dc1-4527-8279-2845d3666202-kube-api-access-xcwbv\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785095 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6bts\" (UniqueName: \"kubernetes.io/projected/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-kube-api-access-t6bts\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785102 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc05d12f-eaa9-48ae-b280-e449caed078c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785112 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b85cr\" (UniqueName: \"kubernetes.io/projected/fc05d12f-eaa9-48ae-b280-e449caed078c-kube-api-access-b85cr\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785120 4735 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc0a0b6f-c304-4350-ad88-813e4637cad7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785127 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f273a98e-41b2-45ac-b140-8c73aaeeed54-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785134 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea62e8b-5dc1-4527-8279-2845d3666202-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785141 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvgj\" (UniqueName: \"kubernetes.io/projected/f273a98e-41b2-45ac-b140-8c73aaeeed54-kube-api-access-vqvgj\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785150 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.785158 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxj5p\" (UniqueName: \"kubernetes.io/projected/bc0a0b6f-c304-4350-ad88-813e4637cad7-kube-api-access-sxj5p\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.884624 4735 generic.go:334] "Generic (PLEG): container finished" podID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerID="79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.884697 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.884688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" event={"ID":"fc05d12f-eaa9-48ae-b280-e449caed078c","Type":"ContainerDied","Data":"79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.884936 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" event={"ID":"fc05d12f-eaa9-48ae-b280-e449caed078c","Type":"ContainerDied","Data":"5eb2457ec378bdcab85caa6e93b6491a04544ba4bd16a63c4358516e66caaef8"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.884966 4735 scope.go:117] "RemoveContainer" containerID="79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.885393 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.885686 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.886122 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.886458 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.886720 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.886929 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.887015 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ea62e8b-5dc1-4527-8279-2845d3666202" 
containerID="283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.887058 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmc6l" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.887086 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmc6l" event={"ID":"7ea62e8b-5dc1-4527-8279-2845d3666202","Type":"ContainerDied","Data":"283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.887112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmc6l" event={"ID":"7ea62e8b-5dc1-4527-8279-2845d3666202","Type":"ContainerDied","Data":"5a9d48169f7184f3bd6cc8d23abb19afe436c4e0f6aa017050feb4dc288feca1"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.887600 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.887830 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.888086 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.888328 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.888614 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.888811 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.889427 4735 generic.go:334] "Generic (PLEG): container finished" podID="508b0d5f-7739-45b0-be00-16110455e3e3" 
containerID="f85d18cef3085dad9d1340e772b023ea68845f1b2a1ec68e24ea595cd05d07c8" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.889476 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"508b0d5f-7739-45b0-be00-16110455e3e3","Type":"ContainerDied","Data":"f85d18cef3085dad9d1340e772b023ea68845f1b2a1ec68e24ea595cd05d07c8"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.889849 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.890124 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.890450 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.890755 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.891027 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.891210 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.891481 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.891967 4735 generic.go:334] "Generic (PLEG): container finished" podID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerID="f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.892005 4735 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-68nhq" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.892005 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68nhq" event={"ID":"f273a98e-41b2-45ac-b140-8c73aaeeed54","Type":"ContainerDied","Data":"f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.892114 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-68nhq" event={"ID":"f273a98e-41b2-45ac-b140-8c73aaeeed54","Type":"ContainerDied","Data":"f319e668b4d3ad416e1720d5be5c087dc99febdf1e4abc355e92749f6a1e261c"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.892403 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.892602 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.892787 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893018 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893283 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893582 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893833 4735 generic.go:334] "Generic (PLEG): container finished" podID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerID="a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893855 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78zr9" 
event={"ID":"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8","Type":"ContainerDied","Data":"a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78zr9" event={"ID":"7ef71104-8a24-4bf0-9be4-f107d3b8f6e8","Type":"ContainerDied","Data":"9bbcd08ee3dc5c12af3c9347823fd517b0e27680866609303332c6fd5fbbf590"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893805 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.893926 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78zr9" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.894429 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.894855 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.895190 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.895582 4735 scope.go:117] "RemoveContainer" containerID="79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.895823 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.896078 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.896311 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.896419 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 09 15:02:26 crc kubenswrapper[4735]: E1209 15:02:26.896264 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9\": container with ID starting with 79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9 not found: ID does not exist" containerID="79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.896676 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.897584 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9"} err="failed to get container status \"79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9\": rpc error: code = NotFound desc = could not find container \"79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9\": container with ID starting with 79af36c49b6aa2db14c6281ca4a1b24ee6a5d26a6846e41ded3e65d62a91b4d9 not found: ID does not exist" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.897697 4735 scope.go:117] "RemoveContainer" containerID="283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.898346 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.898638 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.898809 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.898983 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.899376 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.899685 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.899918 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.900447 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.900972 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.900988 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.900997 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.901004 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f" exitCode=2 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.901781 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.901940 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.902106 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" 
pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.902300 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.902502 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.902706 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.902863 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.903249 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" exitCode=0 Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.903281 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mdmq" event={"ID":"bc0a0b6f-c304-4350-ad88-813e4637cad7","Type":"ContainerDied","Data":"c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.903303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mdmq" event={"ID":"bc0a0b6f-c304-4350-ad88-813e4637cad7","Type":"ContainerDied","Data":"6fbf98357bca8c43bf0c1a855b0826d317bacbb8fbd206e615a0aba66a1505f3"} Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.903345 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4mdmq" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.909890 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.910112 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.910320 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.910971 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.911242 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.911470 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.911673 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.911918 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.912307 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.912566 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.912760 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.912928 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.913093 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.913409 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.915940 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.916160 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.916380 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.920281 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" 
pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.920484 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.920674 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.920848 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.921346 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.922499 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.922886 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.923100 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.923338 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.923599 4735 status_manager.go:851] "Failed to get status for pod" 
podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.923824 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.931377 4735 scope.go:117] "RemoveContainer" containerID="3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.941831 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:26 crc kubenswrapper[4735]: I1209 15:02:26.972171 4735 scope.go:117] "RemoveContainer" containerID="e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003002 4735 scope.go:117] "RemoveContainer" containerID="283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.003279 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe\": container with ID starting with 283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe not found: ID does not exist" containerID="283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003304 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe"} err="failed to get container status \"283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe\": rpc error: code = NotFound desc = could not find container \"283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe\": container with ID starting with 283df76e5558fa5ad41e301a4aae21aca77546d2126ed90a60741510ebafcabe not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003321 4735 scope.go:117] "RemoveContainer" containerID="3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.003586 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597\": container with ID starting with 3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597 not found: ID does not exist" containerID="3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003605 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597"} err="failed to get container status \"3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597\": rpc error: code = NotFound desc = could not find container 
\"3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597\": container with ID starting with 3e350717b311eaf2b114d130b381141277ddb7e67c88f673b7c78cfd8f16b597 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003618 4735 scope.go:117] "RemoveContainer" containerID="e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.003820 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32\": container with ID starting with e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32 not found: ID does not exist" containerID="e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003842 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32"} err="failed to get container status \"e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32\": rpc error: code = NotFound desc = could not find container \"e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32\": container with ID starting with e55f8133aa3394b3a5ee58acd0715e211d0b85156deea49bb45bdf65da5b0d32 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.003854 4735 scope.go:117] "RemoveContainer" containerID="f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.014675 4735 scope.go:117] "RemoveContainer" containerID="186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.015596 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.226:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f9437f57a3aa0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 15:02:27.015064224 +0000 UTC m=+225.939902842,LastTimestamp:2025-12-09 15:02:27.015064224 +0000 UTC m=+225.939902842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.021846 4735 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 09 15:02:27 crc kubenswrapper[4735]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc" Netns:"/var/run/netns/5c1e9f0c-cf50-4b96-8625-9b3f48aaae7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 15:02:27 crc kubenswrapper[4735]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 15:02:27 crc kubenswrapper[4735]: > Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.021899 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 09 15:02:27 crc kubenswrapper[4735]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc" Netns:"/var/run/netns/5c1e9f0c-cf50-4b96-8625-9b3f48aaae7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 15:02:27 crc kubenswrapper[4735]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 15:02:27 crc kubenswrapper[4735]: > pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.021917 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 09 15:02:27 crc kubenswrapper[4735]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc" Netns:"/var/run/netns/5c1e9f0c-cf50-4b96-8625-9b3f48aaae7a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 15:02:27 crc kubenswrapper[4735]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 15:02:27 crc kubenswrapper[4735]: > pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.021964 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"marketplace-operator-79b997595-9d4nc_openshift-marketplace(156f78f9-e75d-4fe3-92b0-e2c29af0728c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-9d4nc_openshift-marketplace(156f78f9-e75d-4fe3-92b0-e2c29af0728c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network \\\"multus-cni-network\\\": plugin 
type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc\\\" Netns:\\\"/var/run/netns/5c1e9f0c-cf50-4b96-8625-9b3f48aaae7a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=e84e04c2ed6bf544873b812adfd9898ddddda2a6323b49d823361e6441936bbc;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s\\\": dial tcp 192.168.25.226:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" podUID="156f78f9-e75d-4fe3-92b0-e2c29af0728c" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.026792 4735 scope.go:117] "RemoveContainer" containerID="7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035158 4735 scope.go:117] "RemoveContainer" containerID="f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.035374 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5\": container with ID starting with f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5 not found: ID does not exist" containerID="f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035399 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5"} err="failed to get container status \"f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5\": rpc error: code = NotFound desc = could not find container \"f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5\": container with ID starting with f0213b8895f656dd29078f82dbe9fb8b02e6355f2a945409fb6c6f5abd7019f5 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035415 4735 scope.go:117] "RemoveContainer" containerID="186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.035638 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c\": container with ID starting with 186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c not found: ID does not exist" containerID="186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035657 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c"} err="failed to get container status \"186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c\": rpc error: code = NotFound desc = could not find container \"186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c\": container with ID starting with 186aa6f3082278424285eff33e64bc2b8cb65f6c280eb5c3858a6a09c372740c not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035671 4735 scope.go:117] "RemoveContainer" containerID="7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.035839 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8\": container with ID starting with 7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8 not found: ID does not exist" containerID="7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035862 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8"} err="failed to get container status \"7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8\": rpc error: code = NotFound desc = could not find container \"7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8\": container with ID starting with 7b1956e185fc529bf2ef3dd9b7074080237edcb3a38943aebe1303b2af914fa8 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.035875 4735 scope.go:117] "RemoveContainer" containerID="a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.044575 4735 scope.go:117] "RemoveContainer" containerID="e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.055153 4735 scope.go:117] "RemoveContainer" containerID="28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.065642 4735 scope.go:117] "RemoveContainer" containerID="a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.065893 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0\": container with ID starting with a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0 not found: ID does not exist" containerID="a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.065919 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0"} err="failed to get container status \"a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0\": rpc error: code = NotFound desc = could not find container \"a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0\": container with ID starting with a16cd98d2f02c5a738f520c920f5fb2ef1ae33de7602cde356826259b288c0a0 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.065935 4735 scope.go:117] "RemoveContainer" containerID="e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.066149 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44\": container with ID starting with e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44 not found: ID does not exist" containerID="e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.066179 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44"} err="failed to get container status \"e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44\": rpc error: code = NotFound desc = could not find container \"e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44\": container with ID starting with e4b2f737eee8a69ab907accb97cd102126a21c45ebf6b64d6ed21b6f390fde44 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.066196 4735 scope.go:117] "RemoveContainer" containerID="28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.066524 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c\": container with ID starting with 28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c not found: ID does not exist" containerID="28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.066545 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c"} err="failed to get container status \"28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c\": rpc error: code = NotFound desc = could not find container \"28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c\": container with ID starting with 28774878f109c140231839d5f1b76c8cb243f35e73561b219002f19d366c8a2c not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.066558 4735 scope.go:117] "RemoveContainer" containerID="9170e178e4ab1ac8a7d022495238ee0c49db33c800a3578ded561df0f70a947a" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.086446 4735 scope.go:117] "RemoveContainer" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.096076 4735 scope.go:117] "RemoveContainer" containerID="a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.109608 4735 
scope.go:117] "RemoveContainer" containerID="1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.120545 4735 scope.go:117] "RemoveContainer" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.120826 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413\": container with ID starting with c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413 not found: ID does not exist" containerID="c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.120853 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413"} err="failed to get container status \"c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413\": rpc error: code = NotFound desc = could not find container \"c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413\": container with ID starting with c647aa1d638c0fa00366cd181b5c6dc805dd62f8eceaf49bb85615ec3e3fe413 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.120868 4735 scope.go:117] "RemoveContainer" containerID="a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.121103 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58\": container with ID starting with a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58 not found: ID does not exist" containerID="a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.121124 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58"} err="failed to get container status \"a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58\": rpc error: code = NotFound desc = could not find container \"a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58\": container with ID starting with a041172b7eb4483375205a6d8d522f9fcc0f61d543260684fdde3cd324db0f58 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.121139 4735 scope.go:117] "RemoveContainer" containerID="1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.121423 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2\": container with ID starting with 1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2 not found: ID does not exist" containerID="1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.121454 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2"} err="failed to get container status 
\"1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2\": rpc error: code = NotFound desc = could not find container \"1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2\": container with ID starting with 1f2cdc80db3f365ff109b959e8ca3100183e528f37ba884a354a68ed6e9192b2 not found: ID does not exist" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.910887 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.912596 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a7ab2338126747d7bfd69f1ac870d3d92601ff836c14b976d1d0b1a497df04a4"} Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.912630 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a446d8eb805b1e60039d0d85d93ec1de7eab532cb3b83faa25efb412f8545d9b"} Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.913062 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:27 crc kubenswrapper[4735]: E1209 15:02:27.913108 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.226:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.913272 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.913466 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.913689 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.913866 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection 
refused" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.914065 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.916197 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:27 crc kubenswrapper[4735]: I1209 15:02:27.916411 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.084186 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.084790 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.085127 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.085365 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.085604 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.085801 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.086003 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.095438 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-kubelet-dir\") pod \"508b0d5f-7739-45b0-be00-16110455e3e3\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.095474 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/508b0d5f-7739-45b0-be00-16110455e3e3-kube-api-access\") pod \"508b0d5f-7739-45b0-be00-16110455e3e3\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.095559 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-var-lock\") pod \"508b0d5f-7739-45b0-be00-16110455e3e3\" (UID: \"508b0d5f-7739-45b0-be00-16110455e3e3\") " Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.095555 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "508b0d5f-7739-45b0-be00-16110455e3e3" (UID: "508b0d5f-7739-45b0-be00-16110455e3e3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.095724 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.095755 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-var-lock" (OuterVolumeSpecName: "var-lock") pod "508b0d5f-7739-45b0-be00-16110455e3e3" (UID: "508b0d5f-7739-45b0-be00-16110455e3e3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.099833 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/508b0d5f-7739-45b0-be00-16110455e3e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "508b0d5f-7739-45b0-be00-16110455e3e3" (UID: "508b0d5f-7739-45b0-be00-16110455e3e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.196476 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/508b0d5f-7739-45b0-be00-16110455e3e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.196497 4735 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/508b0d5f-7739-45b0-be00-16110455e3e3-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.371935 4735 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 09 15:02:28 crc kubenswrapper[4735]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f" Netns:"/var/run/netns/80844bfa-9f5e-4f4f-a933-cea5983531fe" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 15:02:28 crc kubenswrapper[4735]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 15:02:28 crc kubenswrapper[4735]: > Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.371990 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 09 15:02:28 crc kubenswrapper[4735]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f" 
Netns:"/var/run/netns/80844bfa-9f5e-4f4f-a933-cea5983531fe" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 15:02:28 crc kubenswrapper[4735]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 15:02:28 crc kubenswrapper[4735]: > pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.372007 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 09 15:02:28 crc kubenswrapper[4735]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f" Netns:"/var/run/netns/80844bfa-9f5e-4f4f-a933-cea5983531fe" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s": dial tcp 192.168.25.226:6443: connect: connection refused Dec 09 15:02:28 crc kubenswrapper[4735]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 09 15:02:28 crc kubenswrapper[4735]: > pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.372145 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"marketplace-operator-79b997595-9d4nc_openshift-marketplace(156f78f9-e75d-4fe3-92b0-e2c29af0728c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-9d4nc_openshift-marketplace(156f78f9-e75d-4fe3-92b0-e2c29af0728c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-9d4nc_openshift-marketplace_156f78f9-e75d-4fe3-92b0-e2c29af0728c_0(a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f): error adding pod openshift-marketplace_marketplace-operator-79b997595-9d4nc to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f\\\" Netns:\\\"/var/run/netns/80844bfa-9f5e-4f4f-a933-cea5983531fe\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-9d4nc;K8S_POD_INFRA_CONTAINER_ID=a5acc67ccb20d7c94eb2dcaf31ffc6352d4d4f6e5daff6f1422c3ff97fb7d88f;K8S_POD_UID=156f78f9-e75d-4fe3-92b0-e2c29af0728c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-9d4nc] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-9d4nc/156f78f9-e75d-4fe3-92b0-e2c29af0728c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: SetNetworkStatus: failed to update the pod marketplace-operator-79b997595-9d4nc in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-9d4nc?timeout=1m0s\\\": dial tcp 192.168.25.226:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" podUID="156f78f9-e75d-4fe3-92b0-e2c29af0728c" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.852816 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.853851 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.854261 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.854621 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.854949 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.855184 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.855409 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.855619 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.855828 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.904687 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.904767 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 15:02:28 crc 
kubenswrapper[4735]: I1209 15:02:28.904815 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.904801 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.904856 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.904924 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.905113 4735 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.905129 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.905139 4735 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.920880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"508b0d5f-7739-45b0-be00-16110455e3e3","Type":"ContainerDied","Data":"beda907434d5ed6bdecf5562621f0652898131cf308da2a5fae8a5d00d05923a"} Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.920908 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beda907434d5ed6bdecf5562621f0652898131cf308da2a5fae8a5d00d05923a" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.920888 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.923447 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.924030 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51" exitCode=0 Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.924085 4735 scope.go:117] "RemoveContainer" containerID="8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.924091 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.924487 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.25.226:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.934802 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.935182 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.935533 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.935908 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.936274 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.936447 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.936676 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.939671 4735 scope.go:117] "RemoveContainer" containerID="825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.940665 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.940842 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.941056 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.941292 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.941573 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.941785 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.941975 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:28 crc kubenswrapper[4735]: 
I1209 15:02:28.949967 4735 scope.go:117] "RemoveContainer" containerID="1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.958508 4735 scope.go:117] "RemoveContainer" containerID="db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.968045 4735 scope.go:117] "RemoveContainer" containerID="ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.979081 4735 scope.go:117] "RemoveContainer" containerID="b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.995180 4735 scope.go:117] "RemoveContainer" containerID="8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.995457 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\": container with ID starting with 8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286 not found: ID does not exist" containerID="8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.995485 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286"} err="failed to get container status \"8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\": rpc error: code = NotFound desc = could not find container \"8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286\": container with ID starting with 8fd57928b7150ebf43c8966442de48e056664ab5d70dad2e3813cab35f2f5286 not found: ID does not exist" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.995505 4735 scope.go:117] "RemoveContainer" containerID="825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.996453 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\": container with ID starting with 825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65 not found: ID does not exist" containerID="825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.996485 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65"} err="failed to get container status \"825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\": rpc error: code = NotFound desc = could not find container \"825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65\": container with ID starting with 825a789d390f502fe68101b68d1091e725fe919ee013523d2917cfa5a7081b65 not found: ID does not exist" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.996504 4735 scope.go:117] "RemoveContainer" containerID="1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.996782 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\": container with ID starting with 1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e not found: ID does not exist" containerID="1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.996807 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e"} err="failed to get container status \"1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\": rpc error: code = NotFound desc = could not find container \"1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e\": container with ID starting with 1f0627ab704d18bc3d0357fd29fff439d76cbc205e942e95525fbf7c3d73e26e not found: ID does not exist" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.996821 4735 scope.go:117] "RemoveContainer" containerID="db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.997028 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\": container with ID starting with db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f not found: ID does not exist" containerID="db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.997062 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f"} err="failed to get container status \"db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\": rpc error: code = NotFound desc = could not find container \"db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f\": container with ID starting with db5f0274720cc8671bd87e3be451c402feb82607386b06ac9e0af4882cde481f not found: ID does not exist" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.997080 4735 scope.go:117] "RemoveContainer" containerID="ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51" Dec 09 15:02:28 crc kubenswrapper[4735]: E1209 15:02:28.997304 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\": container with ID starting with ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51 not found: ID does not exist" containerID="ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.997328 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51"} err="failed to get container status \"ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\": rpc error: code = NotFound desc = could not find container \"ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51\": container with ID starting with ec3fc8a79a5e0b30a857e47ac708816bc3ef12ee5daea1de1253d83023b7ef51 not found: ID does not exist" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.997341 4735 scope.go:117] "RemoveContainer" containerID="b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218" Dec 09 15:02:28 crc 
kubenswrapper[4735]: E1209 15:02:28.997683 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\": container with ID starting with b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218 not found: ID does not exist" containerID="b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218" Dec 09 15:02:28 crc kubenswrapper[4735]: I1209 15:02:28.997707 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218"} err="failed to get container status \"b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\": rpc error: code = NotFound desc = could not find container \"b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218\": container with ID starting with b4421c9ca97e49f10167c1881d98d21becafaecab80af275430b1703185fe218 not found: ID does not exist" Dec 09 15:02:29 crc kubenswrapper[4735]: I1209 15:02:29.419363 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 09 15:02:30 crc kubenswrapper[4735]: E1209 15:02:30.513001 4735 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 192.168.25.226:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" volumeName="registry-storage" Dec 09 15:02:30 crc kubenswrapper[4735]: E1209 15:02:30.640842 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.25.226:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f9437f57a3aa0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-09 15:02:27.015064224 +0000 UTC m=+225.939902842,LastTimestamp:2025-12-09 15:02:27.015064224 +0000 UTC m=+225.939902842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 09 15:02:31 crc kubenswrapper[4735]: I1209 15:02:31.415179 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:31 crc kubenswrapper[4735]: I1209 15:02:31.415703 4735 status_manager.go:851] "Failed to get status for pod" 
podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:31 crc kubenswrapper[4735]: I1209 15:02:31.415948 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:31 crc kubenswrapper[4735]: I1209 15:02:31.416222 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:31 crc kubenswrapper[4735]: I1209 15:02:31.416456 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:31 crc kubenswrapper[4735]: I1209 15:02:31.416707 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.573555 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.574049 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.574327 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.574570 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.574779 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:35 crc kubenswrapper[4735]: I1209 15:02:35.574809 4735 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 
09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.575026 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="200ms" Dec 09 15:02:35 crc kubenswrapper[4735]: E1209 15:02:35.776361 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="400ms" Dec 09 15:02:36 crc kubenswrapper[4735]: E1209 15:02:36.177682 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="800ms" Dec 09 15:02:36 crc kubenswrapper[4735]: E1209 15:02:36.978550 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.25.226:6443: connect: connection refused" interval="1.6s" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.413347 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.413903 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.414190 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.414476 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.414689 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.414928 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: 
I1209 15:02:37.415188 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.424121 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.424144 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:37 crc kubenswrapper[4735]: E1209 15:02:37.424374 4735 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.424699 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.962413 4735 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="5741df7d76f0b171fe2859e9280367e929b0e1b46992dc4ba719b39a0bee391f" exitCode=0 Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.962503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"5741df7d76f0b171fe2859e9280367e929b0e1b46992dc4ba719b39a0bee391f"} Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.962623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1153351dd98b0de6f2f7bce6b86b5d5aca323dfb561d0313200c424b7276aa3c"} Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.962839 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.962855 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.963098 4735 status_manager.go:851] "Failed to get status for pod" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: E1209 15:02:37.963107 4735 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 192.168.25.226:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.963356 4735 status_manager.go:851] "Failed to get status for pod" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" 
pod="openshift-marketplace/marketplace-operator-79b997595-rxhld" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-rxhld\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.963604 4735 status_manager.go:851] "Failed to get status for pod" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" pod="openshift-marketplace/redhat-marketplace-68nhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-68nhq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.963828 4735 status_manager.go:851] "Failed to get status for pod" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" pod="openshift-marketplace/redhat-operators-4mdmq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4mdmq\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.964161 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" pod="openshift-marketplace/community-operators-78zr9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-78zr9\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:37 crc kubenswrapper[4735]: I1209 15:02:37.964354 4735 status_manager.go:851] "Failed to get status for pod" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" pod="openshift-marketplace/certified-operators-xmc6l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-xmc6l\": dial tcp 192.168.25.226:6443: connect: connection refused" Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.968876 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ca36011ac935e47195fe3964c2f4b1b7bb3d0096ccc15aa6937dbd74f8f00e8"} Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969115 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9a02fb76719dc99b029187a9113e6ed8af160d0c816c58f1641239d41e42c05c"} Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fde6be1084d682526b2e42eac3a0a35ec5954b89ca0c7d3a1f74951af6744aea"} Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969135 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"018db4df892245cb8d0dbfbc0beccb11912839ae2d4158b285ec464f9b5a541c"} Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969142 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7435cdc156190a26a47d77d4b1ca018cfcd207e8e68b77edfc7d3dad8b942636"} Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969353 4735 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969365 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:38 crc kubenswrapper[4735]: I1209 15:02:38.969570 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:39 crc kubenswrapper[4735]: I1209 15:02:39.975274 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 15:02:39 crc kubenswrapper[4735]: I1209 15:02:39.975320 4735 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4" exitCode=1 Dec 09 15:02:39 crc kubenswrapper[4735]: I1209 15:02:39.975345 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4"} Dec 09 15:02:39 crc kubenswrapper[4735]: I1209 15:02:39.975721 4735 scope.go:117] "RemoveContainer" containerID="842c5dea37a777b1e1643b6e3ba1cf2c57797aa3cd6a7b63db9c5129b91c1df4" Dec 09 15:02:40 crc kubenswrapper[4735]: I1209 15:02:40.699565 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:40 crc kubenswrapper[4735]: I1209 15:02:40.982998 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 09 15:02:40 crc kubenswrapper[4735]: I1209 15:02:40.983049 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b44983f66195d3e594ac662fce07864aab462ffd023614009444ee899efadbab"} Dec 09 15:02:42 crc kubenswrapper[4735]: I1209 15:02:42.425466 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:42 crc kubenswrapper[4735]: I1209 15:02:42.425507 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:42 crc kubenswrapper[4735]: I1209 15:02:42.428809 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.413561 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.413991 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:43 crc kubenswrapper[4735]: W1209 15:02:43.746438 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod156f78f9_e75d_4fe3_92b0_e2c29af0728c.slice/crio-8fe473c46135eb10513d7f5b83daba8f0e362eeec08b470adbaa8cf41e0a926c WatchSource:0}: Error finding container 8fe473c46135eb10513d7f5b83daba8f0e362eeec08b470adbaa8cf41e0a926c: Status 404 returned error can't find the container with id 8fe473c46135eb10513d7f5b83daba8f0e362eeec08b470adbaa8cf41e0a926c Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.994855 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/0.log" Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.994896 4735 generic.go:334] "Generic (PLEG): container finished" podID="156f78f9-e75d-4fe3-92b0-e2c29af0728c" containerID="0043e5cf7f63d7f50f6d755fce0445e45b7b241196d07bfdf4ab541569b3aa00" exitCode=1 Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.994921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" event={"ID":"156f78f9-e75d-4fe3-92b0-e2c29af0728c","Type":"ContainerDied","Data":"0043e5cf7f63d7f50f6d755fce0445e45b7b241196d07bfdf4ab541569b3aa00"} Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.994958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" event={"ID":"156f78f9-e75d-4fe3-92b0-e2c29af0728c","Type":"ContainerStarted","Data":"8fe473c46135eb10513d7f5b83daba8f0e362eeec08b470adbaa8cf41e0a926c"} Dec 09 15:02:43 crc kubenswrapper[4735]: I1209 15:02:43.995271 4735 scope.go:117] "RemoveContainer" containerID="0043e5cf7f63d7f50f6d755fce0445e45b7b241196d07bfdf4ab541569b3aa00" Dec 09 15:02:44 crc kubenswrapper[4735]: I1209 15:02:44.115917 4735 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:44 crc kubenswrapper[4735]: I1209 15:02:44.251758 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0545e019-8dc9-4818-9c4a-6f0c4929b4fc" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000096 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/1.log" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000608 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/0.log" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000646 4735 generic.go:334] "Generic (PLEG): container finished" podID="156f78f9-e75d-4fe3-92b0-e2c29af0728c" containerID="999387c5a33e621ab6eb94781346d8183b519114e1b7a57b882d8d4f56ab8d4e" exitCode=1 Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000718 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" event={"ID":"156f78f9-e75d-4fe3-92b0-e2c29af0728c","Type":"ContainerDied","Data":"999387c5a33e621ab6eb94781346d8183b519114e1b7a57b882d8d4f56ab8d4e"} Dec 09 
15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000761 4735 scope.go:117] "RemoveContainer" containerID="0043e5cf7f63d7f50f6d755fce0445e45b7b241196d07bfdf4ab541569b3aa00" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000976 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.000997 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.001118 4735 scope.go:117] "RemoveContainer" containerID="999387c5a33e621ab6eb94781346d8183b519114e1b7a57b882d8d4f56ab8d4e" Dec 09 15:02:45 crc kubenswrapper[4735]: E1209 15:02:45.001295 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-9d4nc_openshift-marketplace(156f78f9-e75d-4fe3-92b0-e2c29af0728c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" podUID="156f78f9-e75d-4fe3-92b0-e2c29af0728c" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.003233 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0545e019-8dc9-4818-9c4a-6f0c4929b4fc" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.007640 4735 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://7435cdc156190a26a47d77d4b1ca018cfcd207e8e68b77edfc7d3dad8b942636" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.007664 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:02:45 crc kubenswrapper[4735]: I1209 15:02:45.018719 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.009128 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/1.log" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.009391 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.009408 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3857fdec-9ac6-41b1-9504-8c256da10835" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.011709 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="0545e019-8dc9-4818-9c4a-6f0c4929b4fc" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.527225 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.527265 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:46 crc kubenswrapper[4735]: I1209 15:02:46.527558 4735 scope.go:117] "RemoveContainer" containerID="999387c5a33e621ab6eb94781346d8183b519114e1b7a57b882d8d4f56ab8d4e" Dec 09 15:02:46 crc kubenswrapper[4735]: E1209 15:02:46.527713 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-9d4nc_openshift-marketplace(156f78f9-e75d-4fe3-92b0-e2c29af0728c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" podUID="156f78f9-e75d-4fe3-92b0-e2c29af0728c" Dec 09 15:02:50 crc kubenswrapper[4735]: I1209 15:02:50.563703 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 09 15:02:50 crc kubenswrapper[4735]: I1209 15:02:50.612283 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 09 15:02:50 crc kubenswrapper[4735]: I1209 15:02:50.699141 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:50 crc kubenswrapper[4735]: I1209 15:02:50.702087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:50 crc kubenswrapper[4735]: I1209 15:02:50.937447 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 09 15:02:51 crc kubenswrapper[4735]: I1209 15:02:51.029476 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 09 15:02:51 crc kubenswrapper[4735]: I1209 15:02:51.169899 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 09 15:02:51 crc kubenswrapper[4735]: I1209 15:02:51.476993 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 09 15:02:51 crc kubenswrapper[4735]: I1209 15:02:51.779671 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 09 15:02:51 crc kubenswrapper[4735]: I1209 15:02:51.856020 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.100315 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.157558 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.250351 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.515992 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.519142 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 09 15:02:52 crc 
kubenswrapper[4735]: I1209 15:02:52.570778 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.612212 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 09 15:02:52 crc kubenswrapper[4735]: I1209 15:02:52.703673 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 09 15:02:53 crc kubenswrapper[4735]: I1209 15:02:53.243551 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 09 15:02:53 crc kubenswrapper[4735]: I1209 15:02:53.304848 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 09 15:02:53 crc kubenswrapper[4735]: I1209 15:02:53.423945 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 09 15:02:54 crc kubenswrapper[4735]: I1209 15:02:54.099496 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 09 15:02:54 crc kubenswrapper[4735]: I1209 15:02:54.318597 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 09 15:02:54 crc kubenswrapper[4735]: I1209 15:02:54.902752 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 09 15:02:54 crc kubenswrapper[4735]: I1209 15:02:54.940152 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:02:55 crc kubenswrapper[4735]: I1209 15:02:55.019824 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 09 15:02:55 crc kubenswrapper[4735]: I1209 15:02:55.040213 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 09 15:02:55 crc kubenswrapper[4735]: I1209 15:02:55.296997 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 09 15:02:55 crc kubenswrapper[4735]: I1209 15:02:55.399283 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 09 15:02:55 crc kubenswrapper[4735]: I1209 15:02:55.642170 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:02:55 crc kubenswrapper[4735]: I1209 15:02:55.946464 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 09 15:02:56 crc kubenswrapper[4735]: I1209 15:02:56.029676 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 09 15:02:56 crc kubenswrapper[4735]: I1209 15:02:56.630228 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 09 15:02:56 crc kubenswrapper[4735]: I1209 15:02:56.634060 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 09 15:02:56 crc 
kubenswrapper[4735]: I1209 15:02:56.751373 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 09 15:02:56 crc kubenswrapper[4735]: I1209 15:02:56.881606 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 09 15:02:57 crc kubenswrapper[4735]: I1209 15:02:57.414317 4735 scope.go:117] "RemoveContainer" containerID="999387c5a33e621ab6eb94781346d8183b519114e1b7a57b882d8d4f56ab8d4e" Dec 09 15:02:57 crc kubenswrapper[4735]: I1209 15:02:57.422365 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 09 15:02:57 crc kubenswrapper[4735]: I1209 15:02:57.633254 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 09 15:02:57 crc kubenswrapper[4735]: I1209 15:02:57.680736 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.050831 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/1.log" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.050886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" event={"ID":"156f78f9-e75d-4fe3-92b0-e2c29af0728c","Type":"ContainerStarted","Data":"93b532c100dd19a9ef7185e36dddf31ba3f9a62f2fd9920e7496f0fe18dfaa57"} Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.051176 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.052538 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.375164 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.425841 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.720772 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.777558 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 09 15:02:58 crc kubenswrapper[4735]: I1209 15:02:58.939452 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.060693 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.191605 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.322001 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.344690 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.546746 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.665619 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.759673 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.884342 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.947280 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 09 15:02:59 crc kubenswrapper[4735]: I1209 15:02:59.989908 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.004271 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.012979 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.035537 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.112854 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.188264 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.569894 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 09 15:03:00 crc kubenswrapper[4735]: I1209 15:03:00.769968 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.011843 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.079387 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.087158 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.110239 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 09 
15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.241176 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.249124 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.262026 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.278200 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.491755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.531014 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.545160 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.560573 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.732178 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.834255 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 09 15:03:01 crc kubenswrapper[4735]: I1209 15:03:01.924698 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.010497 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.025465 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.057612 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.185373 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.214450 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.309604 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.398381 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.417570 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 09 15:03:02 crc 
kubenswrapper[4735]: I1209 15:03:02.434370 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.529853 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.550685 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.652384 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.840692 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.847560 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.850090 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.875053 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.955116 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.985240 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:02 crc kubenswrapper[4735]: I1209 15:03:02.992152 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.058488 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.062604 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.099882 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.135079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.293476 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.301200 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.362407 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.396314 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.514551 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.555705 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.586790 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.717368 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.822949 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.888283 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.915026 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 09 15:03:03 crc kubenswrapper[4735]: I1209 15:03:03.981427 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.064710 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.103781 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.111042 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.153946 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.230005 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.353679 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.363706 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.393552 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.401273 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.551621 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 09 15:03:04 crc kubenswrapper[4735]: 
I1209 15:03:04.551681 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.561296 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.565108 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.634846 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.634893 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.728162 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.916232 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.947840 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 09 15:03:04 crc kubenswrapper[4735]: I1209 15:03:04.972319 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.162696 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.198191 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.244379 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.268971 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.464799 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.649064 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.763488 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.771380 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.867744 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 09 15:03:05 crc kubenswrapper[4735]: I1209 15:03:05.923625 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 09 15:03:06 crc 
kubenswrapper[4735]: I1209 15:03:06.032477 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.058086 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.059431 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9d4nc" podStartSLOduration=40.059417129 podStartE2EDuration="40.059417129s" podCreationTimestamp="2025-12-09 15:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:02:58.061854268 +0000 UTC m=+256.986692896" watchObservedRunningTime="2025-12-09 15:03:06.059417129 +0000 UTC m=+264.984255757" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.061299 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-78zr9","openshift-marketplace/certified-operators-xmc6l","openshift-marketplace/redhat-marketplace-68nhq","openshift-marketplace/redhat-operators-4mdmq","openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/marketplace-operator-79b997595-rxhld"] Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.061361 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.061377 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9d4nc"] Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.064795 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.091549 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.091539996 podStartE2EDuration="22.091539996s" podCreationTimestamp="2025-12-09 15:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:06.077507135 +0000 UTC m=+265.002345763" watchObservedRunningTime="2025-12-09 15:03:06.091539996 +0000 UTC m=+265.016378624" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.277362 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.334962 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.355986 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.407768 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.418220 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.428127 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 09 15:03:06 
crc kubenswrapper[4735]: I1209 15:03:06.447396 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.459372 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.499130 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.518852 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.602624 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.685166 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.697971 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.899440 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 09 15:03:06 crc kubenswrapper[4735]: I1209 15:03:06.977480 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.002075 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.130581 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.260290 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.290131 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.317272 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.328930 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.348846 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.350075 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.379278 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.396898 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 09 15:03:07 crc 
kubenswrapper[4735]: I1209 15:03:07.421085 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" path="/var/lib/kubelet/pods/7ea62e8b-5dc1-4527-8279-2845d3666202/volumes" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.421843 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" path="/var/lib/kubelet/pods/7ef71104-8a24-4bf0-9be4-f107d3b8f6e8/volumes" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.422481 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" path="/var/lib/kubelet/pods/bc0a0b6f-c304-4350-ad88-813e4637cad7/volumes" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.423977 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" path="/var/lib/kubelet/pods/f273a98e-41b2-45ac-b140-8c73aaeeed54/volumes" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.424684 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" path="/var/lib/kubelet/pods/fc05d12f-eaa9-48ae-b280-e449caed078c/volumes" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.492870 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.538005 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.705038 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.710150 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.769304 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.770126 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.787293 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.807666 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.925160 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.949402 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 09 15:03:07 crc kubenswrapper[4735]: I1209 15:03:07.950751 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.003870 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.040457 4735 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.092286 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.179409 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.332815 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.556931 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.557559 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.581116 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.681149 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.745578 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 09 15:03:08 crc kubenswrapper[4735]: I1209 15:03:08.869534 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.031587 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.095103 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.150404 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.172441 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.184772 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.191434 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.317390 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.347623 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.384634 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.439572 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.544151 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.602879 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.640740 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.882999 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.947374 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 09 15:03:09 crc kubenswrapper[4735]: I1209 15:03:09.987083 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.101041 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.121281 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.336902 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.367279 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.389020 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.485270 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.511921 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.518866 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.524475 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.550672 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.585754 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.707646 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.713300 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.744284 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.798872 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.825752 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.899656 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.954716 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 09 15:03:10 crc kubenswrapper[4735]: I1209 15:03:10.971625 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.072298 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.072599 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.078477 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.241662 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.281833 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.393872 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.501313 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.597546 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.616054 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.691441 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.696595 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 09 15:03:11 crc kubenswrapper[4735]: I1209 15:03:11.745382 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 09 15:03:12 crc kubenswrapper[4735]: I1209 15:03:12.098599 4735 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 09 15:03:12 crc kubenswrapper[4735]: I1209 15:03:12.225076 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 09 15:03:12 crc kubenswrapper[4735]: I1209 15:03:12.275846 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 09 15:03:12 crc kubenswrapper[4735]: I1209 15:03:12.832713 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 09 15:03:12 crc kubenswrapper[4735]: I1209 15:03:12.969218 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 09 15:03:13 crc kubenswrapper[4735]: I1209 15:03:13.024959 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 09 15:03:13 crc kubenswrapper[4735]: I1209 15:03:13.196142 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 09 15:03:13 crc kubenswrapper[4735]: I1209 15:03:13.218606 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 09 15:03:13 crc kubenswrapper[4735]: I1209 15:03:13.334369 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 09 15:03:13 crc kubenswrapper[4735]: I1209 15:03:13.475868 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 09 15:03:13 crc kubenswrapper[4735]: I1209 15:03:13.906046 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 09 15:03:14 crc kubenswrapper[4735]: I1209 15:03:14.003636 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 09 15:03:14 crc kubenswrapper[4735]: I1209 15:03:14.007656 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 09 15:03:14 crc kubenswrapper[4735]: I1209 15:03:14.344871 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 09 15:03:14 crc kubenswrapper[4735]: I1209 15:03:14.525443 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 09 15:03:16 crc kubenswrapper[4735]: I1209 15:03:16.913221 4735 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 09 15:03:16 crc kubenswrapper[4735]: I1209 15:03:16.913627 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://a7ab2338126747d7bfd69f1ac870d3d92601ff836c14b976d1d0b1a497df04a4" gracePeriod=5 Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.146246 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 15:03:22 crc kubenswrapper[4735]: 
I1209 15:03:22.146504 4735 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="a7ab2338126747d7bfd69f1ac870d3d92601ff836c14b976d1d0b1a497df04a4" exitCode=137 Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.458324 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.458380 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594403 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594606 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594714 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594452 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594798 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594871 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.594970 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.595014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.595401 4735 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.595698 4735 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.595718 4735 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.595729 4735 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.600844 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:03:22 crc kubenswrapper[4735]: I1209 15:03:22.696299 4735 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:23 crc kubenswrapper[4735]: I1209 15:03:23.152324 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 09 15:03:23 crc kubenswrapper[4735]: I1209 15:03:23.152387 4735 scope.go:117] "RemoveContainer" containerID="a7ab2338126747d7bfd69f1ac870d3d92601ff836c14b976d1d0b1a497df04a4" Dec 09 15:03:23 crc kubenswrapper[4735]: I1209 15:03:23.152436 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 09 15:03:23 crc kubenswrapper[4735]: I1209 15:03:23.418788 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 09 15:03:28 crc kubenswrapper[4735]: I1209 15:03:28.834735 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fpbzm"] Dec 09 15:03:28 crc kubenswrapper[4735]: I1209 15:03:28.835337 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" podUID="b28eb5dd-5f60-4d02-8972-99cba02cb1c8" containerName="controller-manager" containerID="cri-o://416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783" gracePeriod=30 Dec 09 15:03:28 crc kubenswrapper[4735]: I1209 15:03:28.944065 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s"] Dec 09 15:03:28 crc kubenswrapper[4735]: I1209 15:03:28.944271 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" podUID="2ac48d56-9f89-48f7-8840-48d2761beb97" containerName="route-controller-manager" containerID="cri-o://f39fdf014facbf2dc8816e1be6d0b7670172a884b0b8349e58e50f8f19b9d19a" gracePeriod=30 Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.110737 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.174322 4735 generic.go:334] "Generic (PLEG): container finished" podID="2ac48d56-9f89-48f7-8840-48d2761beb97" containerID="f39fdf014facbf2dc8816e1be6d0b7670172a884b0b8349e58e50f8f19b9d19a" exitCode=0 Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.174412 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" event={"ID":"2ac48d56-9f89-48f7-8840-48d2761beb97","Type":"ContainerDied","Data":"f39fdf014facbf2dc8816e1be6d0b7670172a884b0b8349e58e50f8f19b9d19a"} Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.175674 4735 generic.go:334] "Generic (PLEG): container finished" podID="b28eb5dd-5f60-4d02-8972-99cba02cb1c8" containerID="416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783" exitCode=0 Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.175707 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" event={"ID":"b28eb5dd-5f60-4d02-8972-99cba02cb1c8","Type":"ContainerDied","Data":"416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783"} Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.175714 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.175733 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fpbzm" event={"ID":"b28eb5dd-5f60-4d02-8972-99cba02cb1c8","Type":"ContainerDied","Data":"bfe78a9e505506724f7301ecebf4f46a74bb9a5314396715b0e173dbca1f3440"} Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.175749 4735 scope.go:117] "RemoveContainer" containerID="416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.187761 4735 scope.go:117] "RemoveContainer" containerID="416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783" Dec 09 15:03:29 crc kubenswrapper[4735]: E1209 15:03:29.188158 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783\": container with ID starting with 416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783 not found: ID does not exist" containerID="416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.188182 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783"} err="failed to get container status \"416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783\": rpc error: code = NotFound desc = could not find container \"416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783\": container with ID starting with 416bdd9c22385e250a9f7141231f20ae8601dcda7047401088c84de901649783 not found: ID does not exist" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.212868 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.257341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-client-ca\") pod \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.257543 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-config\") pod \"2ac48d56-9f89-48f7-8840-48d2761beb97\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.257653 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-config\") pod \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.257746 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-client-ca\") pod \"2ac48d56-9f89-48f7-8840-48d2761beb97\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.257844 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-serving-cert\") pod \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.257920 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-proxy-ca-bundles\") pod \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258007 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwsr\" (UniqueName: \"kubernetes.io/projected/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-kube-api-access-dqwsr\") pod \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\" (UID: \"b28eb5dd-5f60-4d02-8972-99cba02cb1c8\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzrwj\" (UniqueName: \"kubernetes.io/projected/2ac48d56-9f89-48f7-8840-48d2761beb97-kube-api-access-jzrwj\") pod \"2ac48d56-9f89-48f7-8840-48d2761beb97\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "b28eb5dd-5f60-4d02-8972-99cba02cb1c8" (UID: "b28eb5dd-5f60-4d02-8972-99cba02cb1c8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258194 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-config" (OuterVolumeSpecName: "config") pod "b28eb5dd-5f60-4d02-8972-99cba02cb1c8" (UID: "b28eb5dd-5f60-4d02-8972-99cba02cb1c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-client-ca" (OuterVolumeSpecName: "client-ca") pod "2ac48d56-9f89-48f7-8840-48d2761beb97" (UID: "2ac48d56-9f89-48f7-8840-48d2761beb97"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258355 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b28eb5dd-5f60-4d02-8972-99cba02cb1c8" (UID: "b28eb5dd-5f60-4d02-8972-99cba02cb1c8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258490 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258582 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258655 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258720 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.258611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-config" (OuterVolumeSpecName: "config") pod "2ac48d56-9f89-48f7-8840-48d2761beb97" (UID: "2ac48d56-9f89-48f7-8840-48d2761beb97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.262146 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b28eb5dd-5f60-4d02-8972-99cba02cb1c8" (UID: "b28eb5dd-5f60-4d02-8972-99cba02cb1c8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.262163 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ac48d56-9f89-48f7-8840-48d2761beb97-kube-api-access-jzrwj" (OuterVolumeSpecName: "kube-api-access-jzrwj") pod "2ac48d56-9f89-48f7-8840-48d2761beb97" (UID: "2ac48d56-9f89-48f7-8840-48d2761beb97"). InnerVolumeSpecName "kube-api-access-jzrwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.262206 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-kube-api-access-dqwsr" (OuterVolumeSpecName: "kube-api-access-dqwsr") pod "b28eb5dd-5f60-4d02-8972-99cba02cb1c8" (UID: "b28eb5dd-5f60-4d02-8972-99cba02cb1c8"). InnerVolumeSpecName "kube-api-access-dqwsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.359351 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ac48d56-9f89-48f7-8840-48d2761beb97-serving-cert\") pod \"2ac48d56-9f89-48f7-8840-48d2761beb97\" (UID: \"2ac48d56-9f89-48f7-8840-48d2761beb97\") " Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.359583 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac48d56-9f89-48f7-8840-48d2761beb97-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.359602 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.359610 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwsr\" (UniqueName: \"kubernetes.io/projected/b28eb5dd-5f60-4d02-8972-99cba02cb1c8-kube-api-access-dqwsr\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.359621 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzrwj\" (UniqueName: \"kubernetes.io/projected/2ac48d56-9f89-48f7-8840-48d2761beb97-kube-api-access-jzrwj\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.360934 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ac48d56-9f89-48f7-8840-48d2761beb97-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2ac48d56-9f89-48f7-8840-48d2761beb97" (UID: "2ac48d56-9f89-48f7-8840-48d2761beb97"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.461219 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ac48d56-9f89-48f7-8840-48d2761beb97-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.486815 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fpbzm"] Dec 09 15:03:29 crc kubenswrapper[4735]: I1209 15:03:29.488875 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fpbzm"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055657 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c59b458cf-ff2pt"] Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055853 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055866 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055875 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055883 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055893 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055898 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055905 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055910 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055917 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055923 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055932 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055937 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055943 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28eb5dd-5f60-4d02-8972-99cba02cb1c8" containerName="controller-manager" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055949 4735 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b28eb5dd-5f60-4d02-8972-99cba02cb1c8" containerName="controller-manager" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055955 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" containerName="installer" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055961 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" containerName="installer" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055967 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055973 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055981 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055986 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.055993 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.055998 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.056005 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056010 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.056017 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ac48d56-9f89-48f7-8840-48d2761beb97" containerName="route-controller-manager" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056022 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ac48d56-9f89-48f7-8840-48d2761beb97" containerName="route-controller-manager" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.056029 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056034 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="extract-content" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.056040 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerName="marketplace-operator" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056046 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerName="marketplace-operator" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.056052 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 
15:03:30.056058 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: E1209 15:03:30.056069 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056074 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="extract-utilities" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056160 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28eb5dd-5f60-4d02-8972-99cba02cb1c8" containerName="controller-manager" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056173 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0a0b6f-c304-4350-ad88-813e4637cad7" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056182 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="508b0d5f-7739-45b0-be00-16110455e3e3" containerName="installer" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056189 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef71104-8a24-4bf0-9be4-f107d3b8f6e8" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056194 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ac48d56-9f89-48f7-8840-48d2761beb97" containerName="route-controller-manager" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056200 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f273a98e-41b2-45ac-b140-8c73aaeeed54" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056209 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056217 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc05d12f-eaa9-48ae-b280-e449caed078c" containerName="marketplace-operator" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056224 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea62e8b-5dc1-4527-8279-2845d3666202" containerName="registry-server" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.056560 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.057883 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.057967 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.057986 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.058185 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.058570 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.058890 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.058896 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.058981 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.063680 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b458cf-ff2pt"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.065905 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067008 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-proxy-ca-bundles\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-config\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067057 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-client-ca\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067070 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8238321-a193-4ac5-b027-b8e3f8962ebe-serving-cert\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kvd6\" (UniqueName: \"kubernetes.io/projected/c8238321-a193-4ac5-b027-b8e3f8962ebe-kube-api-access-7kvd6\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067123 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067158 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6gc\" (UniqueName: \"kubernetes.io/projected/3df48875-e111-4fa9-ac81-e3ae198c221a-kube-api-access-4w6gc\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-client-ca\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067206 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df48875-e111-4fa9-ac81-e3ae198c221a-serving-cert\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.067225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-config\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.087218 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sn6kd"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.088021 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.089819 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.095645 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn6kd"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.167895 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwvk\" (UniqueName: \"kubernetes.io/projected/9fd06704-c0be-460b-ad6d-7d976889607e-kube-api-access-rnwvk\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.167938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df48875-e111-4fa9-ac81-e3ae198c221a-serving-cert\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.167963 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-config\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168030 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-proxy-ca-bundles\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168060 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd06704-c0be-460b-ad6d-7d976889607e-catalog-content\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd06704-c0be-460b-ad6d-7d976889607e-utilities\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-config\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168163 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-client-ca\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168178 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8238321-a193-4ac5-b027-b8e3f8962ebe-serving-cert\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kvd6\" (UniqueName: \"kubernetes.io/projected/c8238321-a193-4ac5-b027-b8e3f8962ebe-kube-api-access-7kvd6\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168261 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w6gc\" (UniqueName: \"kubernetes.io/projected/3df48875-e111-4fa9-ac81-e3ae198c221a-kube-api-access-4w6gc\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.168276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-client-ca\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.169015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-proxy-ca-bundles\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.169034 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-client-ca\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.169110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-client-ca\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.169152 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-config\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: 
\"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.169276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-config\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.171160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df48875-e111-4fa9-ac81-e3ae198c221a-serving-cert\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.171555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8238321-a193-4ac5-b027-b8e3f8962ebe-serving-cert\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.180825 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" event={"ID":"2ac48d56-9f89-48f7-8840-48d2761beb97","Type":"ContainerDied","Data":"3d4f73c95edd9049f96b942b6f2c905437e519a397c55f691ad84d09113c3160"} Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.180833 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.180861 4735 scope.go:117] "RemoveContainer" containerID="f39fdf014facbf2dc8816e1be6d0b7670172a884b0b8349e58e50f8f19b9d19a" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.181249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6gc\" (UniqueName: \"kubernetes.io/projected/3df48875-e111-4fa9-ac81-e3ae198c221a-kube-api-access-4w6gc\") pod \"controller-manager-6c59b458cf-ff2pt\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.181477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kvd6\" (UniqueName: \"kubernetes.io/projected/c8238321-a193-4ac5-b027-b8e3f8962ebe-kube-api-access-7kvd6\") pod \"route-controller-manager-6cdcb8b5d6-nfqhg\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.201900 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.209051 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sqm8s"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.269135 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd06704-c0be-460b-ad6d-7d976889607e-catalog-content\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.269173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd06704-c0be-460b-ad6d-7d976889607e-utilities\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.269229 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwvk\" (UniqueName: \"kubernetes.io/projected/9fd06704-c0be-460b-ad6d-7d976889607e-kube-api-access-rnwvk\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.269605 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fd06704-c0be-460b-ad6d-7d976889607e-catalog-content\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.270297 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fd06704-c0be-460b-ad6d-7d976889607e-utilities\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc 
kubenswrapper[4735]: I1209 15:03:30.284803 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hh9r9"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.285751 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.287077 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwvk\" (UniqueName: \"kubernetes.io/projected/9fd06704-c0be-460b-ad6d-7d976889607e-kube-api-access-rnwvk\") pod \"certified-operators-sn6kd\" (UID: \"9fd06704-c0be-460b-ad6d-7d976889607e\") " pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.287681 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.291868 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh9r9"] Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.368260 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.370137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-utilities\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.370166 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-catalog-content\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.370220 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmnhf\" (UniqueName: \"kubernetes.io/projected/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-kube-api-access-fmnhf\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.373390 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.397447 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.470875 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-utilities\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.470910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-catalog-content\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.470982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmnhf\" (UniqueName: \"kubernetes.io/projected/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-kube-api-access-fmnhf\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.471454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-utilities\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.471536 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-catalog-content\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.485644 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmnhf\" (UniqueName: \"kubernetes.io/projected/b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60-kube-api-access-fmnhf\") pod \"redhat-marketplace-hh9r9\" (UID: \"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60\") " pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.604369 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.702005 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b458cf-ff2pt"] Dec 09 15:03:30 crc kubenswrapper[4735]: W1209 15:03:30.707293 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3df48875_e111_4fa9_ac81_e3ae198c221a.slice/crio-edc7b342e49ed7c1f3fc223f1f3b9b000b33b3f702e0f9753399ea451fa67eb6 WatchSource:0}: Error finding container edc7b342e49ed7c1f3fc223f1f3b9b000b33b3f702e0f9753399ea451fa67eb6: Status 404 returned error can't find the container with id edc7b342e49ed7c1f3fc223f1f3b9b000b33b3f702e0f9753399ea451fa67eb6 Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.731214 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg"] Dec 09 15:03:30 crc kubenswrapper[4735]: W1209 15:03:30.733971 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8238321_a193_4ac5_b027_b8e3f8962ebe.slice/crio-1604d3f6c3e552be1c02cdd0cb303efd462bd9d991783865576d16df7d620d5e WatchSource:0}: Error finding container 1604d3f6c3e552be1c02cdd0cb303efd462bd9d991783865576d16df7d620d5e: Status 404 returned error can't find the container with id 1604d3f6c3e552be1c02cdd0cb303efd462bd9d991783865576d16df7d620d5e Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.760258 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn6kd"] Dec 09 15:03:30 crc kubenswrapper[4735]: W1209 15:03:30.775719 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fd06704_c0be_460b_ad6d_7d976889607e.slice/crio-b7ea15fccfb5deab95c47dc338d81a8cb84a53b6593c9bebb3b6fbfd9ae42cdb WatchSource:0}: Error finding container b7ea15fccfb5deab95c47dc338d81a8cb84a53b6593c9bebb3b6fbfd9ae42cdb: Status 404 returned error can't find the container with id b7ea15fccfb5deab95c47dc338d81a8cb84a53b6593c9bebb3b6fbfd9ae42cdb Dec 09 15:03:30 crc kubenswrapper[4735]: I1209 15:03:30.954169 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh9r9"] Dec 09 15:03:30 crc kubenswrapper[4735]: W1209 15:03:30.967312 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45c7f1f_c09b_4d8a_9bfb_b8dec29f1e60.slice/crio-f6c6c8f46cc36b54d0b8ee553d0d2be6df695f200c471ee3c17473ae33e4c1eb WatchSource:0}: Error finding container f6c6c8f46cc36b54d0b8ee553d0d2be6df695f200c471ee3c17473ae33e4c1eb: Status 404 returned error can't find the container with id f6c6c8f46cc36b54d0b8ee553d0d2be6df695f200c471ee3c17473ae33e4c1eb Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.189004 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" event={"ID":"c8238321-a193-4ac5-b027-b8e3f8962ebe","Type":"ContainerStarted","Data":"f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.189039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" 
event={"ID":"c8238321-a193-4ac5-b027-b8e3f8962ebe","Type":"ContainerStarted","Data":"1604d3f6c3e552be1c02cdd0cb303efd462bd9d991783865576d16df7d620d5e"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.189138 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.190726 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" event={"ID":"3df48875-e111-4fa9-ac81-e3ae198c221a","Type":"ContainerStarted","Data":"6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.190762 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" event={"ID":"3df48875-e111-4fa9-ac81-e3ae198c221a","Type":"ContainerStarted","Data":"edc7b342e49ed7c1f3fc223f1f3b9b000b33b3f702e0f9753399ea451fa67eb6"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.190845 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.192186 4735 generic.go:334] "Generic (PLEG): container finished" podID="b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60" containerID="5e095f3e204410198724aaf31d533d113d0ad777d1a1dd73857a089a66f7a37d" exitCode=0 Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.192237 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh9r9" event={"ID":"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60","Type":"ContainerDied","Data":"5e095f3e204410198724aaf31d533d113d0ad777d1a1dd73857a089a66f7a37d"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.192252 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh9r9" event={"ID":"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60","Type":"ContainerStarted","Data":"f6c6c8f46cc36b54d0b8ee553d0d2be6df695f200c471ee3c17473ae33e4c1eb"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.193670 4735 generic.go:334] "Generic (PLEG): container finished" podID="9fd06704-c0be-460b-ad6d-7d976889607e" containerID="d6b7d32de331146369ac64beeb1f22fd8057103334e8b0988b4b955128c5bc52" exitCode=0 Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.193696 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn6kd" event={"ID":"9fd06704-c0be-460b-ad6d-7d976889607e","Type":"ContainerDied","Data":"d6b7d32de331146369ac64beeb1f22fd8057103334e8b0988b4b955128c5bc52"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.193711 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn6kd" event={"ID":"9fd06704-c0be-460b-ad6d-7d976889607e","Type":"ContainerStarted","Data":"b7ea15fccfb5deab95c47dc338d81a8cb84a53b6593c9bebb3b6fbfd9ae42cdb"} Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.194922 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.201264 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" podStartSLOduration=3.201250541 podStartE2EDuration="3.201250541s" 
podCreationTimestamp="2025-12-09 15:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:31.199558532 +0000 UTC m=+290.124397160" watchObservedRunningTime="2025-12-09 15:03:31.201250541 +0000 UTC m=+290.126089169" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.218563 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" podStartSLOduration=3.218554058 podStartE2EDuration="3.218554058s" podCreationTimestamp="2025-12-09 15:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:31.215879725 +0000 UTC m=+290.140718354" watchObservedRunningTime="2025-12-09 15:03:31.218554058 +0000 UTC m=+290.143392686" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.269919 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.418769 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ac48d56-9f89-48f7-8840-48d2761beb97" path="/var/lib/kubelet/pods/2ac48d56-9f89-48f7-8840-48d2761beb97/volumes" Dec 09 15:03:31 crc kubenswrapper[4735]: I1209 15:03:31.419253 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28eb5dd-5f60-4d02-8972-99cba02cb1c8" path="/var/lib/kubelet/pods/b28eb5dd-5f60-4d02-8972-99cba02cb1c8/volumes" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.198972 4735 generic.go:334] "Generic (PLEG): container finished" podID="b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60" containerID="2eb82bf97cccb19eb728aa357f010b525366d3a2dee36790bcfeb8e3c0bd9d9e" exitCode=0 Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.199050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh9r9" event={"ID":"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60","Type":"ContainerDied","Data":"2eb82bf97cccb19eb728aa357f010b525366d3a2dee36790bcfeb8e3c0bd9d9e"} Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.200411 4735 generic.go:334] "Generic (PLEG): container finished" podID="9fd06704-c0be-460b-ad6d-7d976889607e" containerID="8771eed6deaf4f40b985d5e95fec2330135ba78d9174578769ef7d02deaf1a28" exitCode=0 Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.200450 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn6kd" event={"ID":"9fd06704-c0be-460b-ad6d-7d976889607e","Type":"ContainerDied","Data":"8771eed6deaf4f40b985d5e95fec2330135ba78d9174578769ef7d02deaf1a28"} Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.487793 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vzx8d"] Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.488915 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.491011 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.492095 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vzx8d"] Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.594026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-utilities\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.594124 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-catalog-content\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.594147 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vktp\" (UniqueName: \"kubernetes.io/projected/33564b45-b47d-4cb7-8ff6-fa0226782e59-kube-api-access-8vktp\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.690174 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d2c69"] Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.691900 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.694425 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.695452 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-catalog-content\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.695488 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vktp\" (UniqueName: \"kubernetes.io/projected/33564b45-b47d-4cb7-8ff6-fa0226782e59-kube-api-access-8vktp\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.695629 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-utilities\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.696083 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-catalog-content\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.696182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-utilities\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.701151 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2c69"] Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.716613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vktp\" (UniqueName: \"kubernetes.io/projected/33564b45-b47d-4cb7-8ff6-fa0226782e59-kube-api-access-8vktp\") pod \"redhat-operators-vzx8d\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") " pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.796490 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d05f6-6e95-4f8a-a360-ae278fd02c3b-catalog-content\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.796559 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbh57\" (UniqueName: \"kubernetes.io/projected/656d05f6-6e95-4f8a-a360-ae278fd02c3b-kube-api-access-dbh57\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") 
" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.796608 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d05f6-6e95-4f8a-a360-ae278fd02c3b-utilities\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.802827 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.897427 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d05f6-6e95-4f8a-a360-ae278fd02c3b-utilities\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.897674 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d05f6-6e95-4f8a-a360-ae278fd02c3b-catalog-content\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.897711 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbh57\" (UniqueName: \"kubernetes.io/projected/656d05f6-6e95-4f8a-a360-ae278fd02c3b-kube-api-access-dbh57\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.897961 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d05f6-6e95-4f8a-a360-ae278fd02c3b-utilities\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.898006 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d05f6-6e95-4f8a-a360-ae278fd02c3b-catalog-content\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:32 crc kubenswrapper[4735]: I1209 15:03:32.914172 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbh57\" (UniqueName: \"kubernetes.io/projected/656d05f6-6e95-4f8a-a360-ae278fd02c3b-kube-api-access-dbh57\") pod \"community-operators-d2c69\" (UID: \"656d05f6-6e95-4f8a-a360-ae278fd02c3b\") " pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.004422 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.142713 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vzx8d"] Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.209303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerStarted","Data":"5967e861fd31f4b3a5aef9d7721d54c64f6750913ec3ffb4d224f473916d76f3"} Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.211373 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn6kd" event={"ID":"9fd06704-c0be-460b-ad6d-7d976889607e","Type":"ContainerStarted","Data":"7a112599f428d103490b0e35b4db4980c5b7cd1deb94c9a7c101aa139c74f1db"} Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.215069 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh9r9" event={"ID":"b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60","Type":"ContainerStarted","Data":"9a8ae5d1d20f0297cfa14e629ef13581f46e705ae8cc7eba82c167d1c2d7b57e"} Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.227232 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sn6kd" podStartSLOduration=1.7195502889999998 podStartE2EDuration="3.227219246s" podCreationTimestamp="2025-12-09 15:03:30 +0000 UTC" firstStartedPulling="2025-12-09 15:03:31.194741277 +0000 UTC m=+290.119579906" lastFinishedPulling="2025-12-09 15:03:32.702410234 +0000 UTC m=+291.627248863" observedRunningTime="2025-12-09 15:03:33.226756892 +0000 UTC m=+292.151595520" watchObservedRunningTime="2025-12-09 15:03:33.227219246 +0000 UTC m=+292.152057873" Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.238853 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hh9r9" podStartSLOduration=1.7267441300000002 podStartE2EDuration="3.238838761s" podCreationTimestamp="2025-12-09 15:03:30 +0000 UTC" firstStartedPulling="2025-12-09 15:03:31.193202705 +0000 UTC m=+290.118041333" lastFinishedPulling="2025-12-09 15:03:32.705297336 +0000 UTC m=+291.630135964" observedRunningTime="2025-12-09 15:03:33.23827102 +0000 UTC m=+292.163109649" watchObservedRunningTime="2025-12-09 15:03:33.238838761 +0000 UTC m=+292.163677390" Dec 09 15:03:33 crc kubenswrapper[4735]: I1209 15:03:33.338246 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2c69"] Dec 09 15:03:33 crc kubenswrapper[4735]: W1209 15:03:33.343351 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod656d05f6_6e95_4f8a_a360_ae278fd02c3b.slice/crio-dc7bd8eaba41e1ffa02a6440253acae72ce107d43c27c58b0194862ba178f1f1 WatchSource:0}: Error finding container dc7bd8eaba41e1ffa02a6440253acae72ce107d43c27c58b0194862ba178f1f1: Status 404 returned error can't find the container with id dc7bd8eaba41e1ffa02a6440253acae72ce107d43c27c58b0194862ba178f1f1 Dec 09 15:03:33 crc kubenswrapper[4735]: E1209 15:03:33.478904 4735 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod656d05f6_6e95_4f8a_a360_ae278fd02c3b.slice/crio-cd36d96b9dd9056293a5faddccbb750c334d17f9c98d57981af5de37df441bd8.scope\": RecentStats: unable to find data in memory cache]" Dec 09 15:03:34 crc kubenswrapper[4735]: I1209 15:03:34.221322 4735 generic.go:334] "Generic (PLEG): container finished" podID="656d05f6-6e95-4f8a-a360-ae278fd02c3b" containerID="cd36d96b9dd9056293a5faddccbb750c334d17f9c98d57981af5de37df441bd8" exitCode=0 Dec 09 15:03:34 crc kubenswrapper[4735]: I1209 15:03:34.221501 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2c69" event={"ID":"656d05f6-6e95-4f8a-a360-ae278fd02c3b","Type":"ContainerDied","Data":"cd36d96b9dd9056293a5faddccbb750c334d17f9c98d57981af5de37df441bd8"} Dec 09 15:03:34 crc kubenswrapper[4735]: I1209 15:03:34.221797 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2c69" event={"ID":"656d05f6-6e95-4f8a-a360-ae278fd02c3b","Type":"ContainerStarted","Data":"dc7bd8eaba41e1ffa02a6440253acae72ce107d43c27c58b0194862ba178f1f1"} Dec 09 15:03:34 crc kubenswrapper[4735]: I1209 15:03:34.222973 4735 generic.go:334] "Generic (PLEG): container finished" podID="33564b45-b47d-4cb7-8ff6-fa0226782e59" containerID="8f060fae2d793e26a91a68951aa8f9914c3e7fbb43b679be84d28f8a8e4d2c60" exitCode=0 Dec 09 15:03:34 crc kubenswrapper[4735]: I1209 15:03:34.223229 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerDied","Data":"8f060fae2d793e26a91a68951aa8f9914c3e7fbb43b679be84d28f8a8e4d2c60"} Dec 09 15:03:35 crc kubenswrapper[4735]: I1209 15:03:35.230029 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerStarted","Data":"b0c41cf55954787b3a7ee4ef5eb62c8a3635c79be3dc3a03d65730f38beef7dd"} Dec 09 15:03:35 crc kubenswrapper[4735]: I1209 15:03:35.231740 4735 generic.go:334] "Generic (PLEG): container finished" podID="656d05f6-6e95-4f8a-a360-ae278fd02c3b" containerID="40d3e40aed23cd6edc423acf4fdcf90052ca740e9f61234de36d636891cf9e22" exitCode=0 Dec 09 15:03:35 crc kubenswrapper[4735]: I1209 15:03:35.231773 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2c69" event={"ID":"656d05f6-6e95-4f8a-a360-ae278fd02c3b","Type":"ContainerDied","Data":"40d3e40aed23cd6edc423acf4fdcf90052ca740e9f61234de36d636891cf9e22"} Dec 09 15:03:36 crc kubenswrapper[4735]: I1209 15:03:36.237971 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2c69" event={"ID":"656d05f6-6e95-4f8a-a360-ae278fd02c3b","Type":"ContainerStarted","Data":"eb2ff2b095b1412426400b9b590621708a33574be142e06cf2fc3a1ed099bbe1"} Dec 09 15:03:36 crc kubenswrapper[4735]: I1209 15:03:36.239658 4735 generic.go:334] "Generic (PLEG): container finished" podID="33564b45-b47d-4cb7-8ff6-fa0226782e59" containerID="b0c41cf55954787b3a7ee4ef5eb62c8a3635c79be3dc3a03d65730f38beef7dd" exitCode=0 Dec 09 15:03:36 crc kubenswrapper[4735]: I1209 15:03:36.239689 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerDied","Data":"b0c41cf55954787b3a7ee4ef5eb62c8a3635c79be3dc3a03d65730f38beef7dd"} Dec 09 15:03:36 crc kubenswrapper[4735]: 
I1209 15:03:36.250503 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d2c69" podStartSLOduration=2.678487536 podStartE2EDuration="4.250493018s" podCreationTimestamp="2025-12-09 15:03:32 +0000 UTC" firstStartedPulling="2025-12-09 15:03:34.223244795 +0000 UTC m=+293.148083423" lastFinishedPulling="2025-12-09 15:03:35.795250277 +0000 UTC m=+294.720088905" observedRunningTime="2025-12-09 15:03:36.249210353 +0000 UTC m=+295.174048982" watchObservedRunningTime="2025-12-09 15:03:36.250493018 +0000 UTC m=+295.175331646" Dec 09 15:03:37 crc kubenswrapper[4735]: I1209 15:03:37.246216 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerStarted","Data":"655dc98f098b95b2a374222184d565e99cdcddc9380a354d96df60bb040d058b"} Dec 09 15:03:37 crc kubenswrapper[4735]: I1209 15:03:37.258774 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vzx8d" podStartSLOduration=2.737465226 podStartE2EDuration="5.258760995s" podCreationTimestamp="2025-12-09 15:03:32 +0000 UTC" firstStartedPulling="2025-12-09 15:03:34.225262221 +0000 UTC m=+293.150100849" lastFinishedPulling="2025-12-09 15:03:36.746557991 +0000 UTC m=+295.671396618" observedRunningTime="2025-12-09 15:03:37.258176413 +0000 UTC m=+296.183015031" watchObservedRunningTime="2025-12-09 15:03:37.258760995 +0000 UTC m=+296.183599623" Dec 09 15:03:40 crc kubenswrapper[4735]: I1209 15:03:40.397819 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:40 crc kubenswrapper[4735]: I1209 15:03:40.398055 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:40 crc kubenswrapper[4735]: I1209 15:03:40.426439 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:40 crc kubenswrapper[4735]: I1209 15:03:40.605057 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:40 crc kubenswrapper[4735]: I1209 15:03:40.605094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:40 crc kubenswrapper[4735]: I1209 15:03:40.630678 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:41 crc kubenswrapper[4735]: I1209 15:03:41.287083 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hh9r9" Dec 09 15:03:41 crc kubenswrapper[4735]: I1209 15:03:41.287564 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sn6kd" Dec 09 15:03:42 crc kubenswrapper[4735]: I1209 15:03:42.803560 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:42 crc kubenswrapper[4735]: I1209 15:03:42.803802 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:42 crc kubenswrapper[4735]: I1209 15:03:42.829176 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:43 crc kubenswrapper[4735]: I1209 15:03:43.005348 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:43 crc kubenswrapper[4735]: I1209 15:03:43.005563 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:43 crc kubenswrapper[4735]: I1209 15:03:43.030019 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:43 crc kubenswrapper[4735]: I1209 15:03:43.293571 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vzx8d" Dec 09 15:03:43 crc kubenswrapper[4735]: I1209 15:03:43.294446 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d2c69" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.281669 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b458cf-ff2pt"] Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.281849 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" podUID="3df48875-e111-4fa9-ac81-e3ae198c221a" containerName="controller-manager" containerID="cri-o://6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8" gracePeriod=30 Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.368049 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg"] Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.368418 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" podUID="c8238321-a193-4ac5-b027-b8e3f8962ebe" containerName="route-controller-manager" containerID="cri-o://f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826" gracePeriod=30 Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.724970 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.777016 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.876733 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8238321-a193-4ac5-b027-b8e3f8962ebe-serving-cert\") pod \"c8238321-a193-4ac5-b027-b8e3f8962ebe\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.876795 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-config\") pod \"c8238321-a193-4ac5-b027-b8e3f8962ebe\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.876833 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kvd6\" (UniqueName: \"kubernetes.io/projected/c8238321-a193-4ac5-b027-b8e3f8962ebe-kube-api-access-7kvd6\") pod \"c8238321-a193-4ac5-b027-b8e3f8962ebe\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.876957 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-client-ca\") pod \"c8238321-a193-4ac5-b027-b8e3f8962ebe\" (UID: \"c8238321-a193-4ac5-b027-b8e3f8962ebe\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.877631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-client-ca" (OuterVolumeSpecName: "client-ca") pod "c8238321-a193-4ac5-b027-b8e3f8962ebe" (UID: "c8238321-a193-4ac5-b027-b8e3f8962ebe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.877936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-config" (OuterVolumeSpecName: "config") pod "c8238321-a193-4ac5-b027-b8e3f8962ebe" (UID: "c8238321-a193-4ac5-b027-b8e3f8962ebe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.882875 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8238321-a193-4ac5-b027-b8e3f8962ebe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c8238321-a193-4ac5-b027-b8e3f8962ebe" (UID: "c8238321-a193-4ac5-b027-b8e3f8962ebe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.883138 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8238321-a193-4ac5-b027-b8e3f8962ebe-kube-api-access-7kvd6" (OuterVolumeSpecName: "kube-api-access-7kvd6") pod "c8238321-a193-4ac5-b027-b8e3f8962ebe" (UID: "c8238321-a193-4ac5-b027-b8e3f8962ebe"). InnerVolumeSpecName "kube-api-access-7kvd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978170 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df48875-e111-4fa9-ac81-e3ae198c221a-serving-cert\") pod \"3df48875-e111-4fa9-ac81-e3ae198c221a\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978244 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-config\") pod \"3df48875-e111-4fa9-ac81-e3ae198c221a\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978269 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-proxy-ca-bundles\") pod \"3df48875-e111-4fa9-ac81-e3ae198c221a\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978333 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-client-ca\") pod \"3df48875-e111-4fa9-ac81-e3ae198c221a\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978351 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w6gc\" (UniqueName: \"kubernetes.io/projected/3df48875-e111-4fa9-ac81-e3ae198c221a-kube-api-access-4w6gc\") pod \"3df48875-e111-4fa9-ac81-e3ae198c221a\" (UID: \"3df48875-e111-4fa9-ac81-e3ae198c221a\") " Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978481 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8238321-a193-4ac5-b027-b8e3f8962ebe-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978499 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978508 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kvd6\" (UniqueName: \"kubernetes.io/projected/c8238321-a193-4ac5-b027-b8e3f8962ebe-kube-api-access-7kvd6\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978539 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8238321-a193-4ac5-b027-b8e3f8962ebe-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.978914 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-config" (OuterVolumeSpecName: "config") pod "3df48875-e111-4fa9-ac81-e3ae198c221a" (UID: "3df48875-e111-4fa9-ac81-e3ae198c221a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.979166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-client-ca" (OuterVolumeSpecName: "client-ca") pod "3df48875-e111-4fa9-ac81-e3ae198c221a" (UID: "3df48875-e111-4fa9-ac81-e3ae198c221a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.979341 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3df48875-e111-4fa9-ac81-e3ae198c221a" (UID: "3df48875-e111-4fa9-ac81-e3ae198c221a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.981002 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df48875-e111-4fa9-ac81-e3ae198c221a-kube-api-access-4w6gc" (OuterVolumeSpecName: "kube-api-access-4w6gc") pod "3df48875-e111-4fa9-ac81-e3ae198c221a" (UID: "3df48875-e111-4fa9-ac81-e3ae198c221a"). InnerVolumeSpecName "kube-api-access-4w6gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:03:49 crc kubenswrapper[4735]: I1209 15:03:49.981866 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df48875-e111-4fa9-ac81-e3ae198c221a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3df48875-e111-4fa9-ac81-e3ae198c221a" (UID: "3df48875-e111-4fa9-ac81-e3ae198c221a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.079492 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.079537 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.079549 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3df48875-e111-4fa9-ac81-e3ae198c221a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.079558 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w6gc\" (UniqueName: \"kubernetes.io/projected/3df48875-e111-4fa9-ac81-e3ae198c221a-kube-api-access-4w6gc\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.079567 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3df48875-e111-4fa9-ac81-e3ae198c221a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.293755 4735 generic.go:334] "Generic (PLEG): container finished" podID="c8238321-a193-4ac5-b027-b8e3f8962ebe" containerID="f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826" exitCode=0 Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.293849 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.294193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" event={"ID":"c8238321-a193-4ac5-b027-b8e3f8962ebe","Type":"ContainerDied","Data":"f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826"} Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.294231 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg" event={"ID":"c8238321-a193-4ac5-b027-b8e3f8962ebe","Type":"ContainerDied","Data":"1604d3f6c3e552be1c02cdd0cb303efd462bd9d991783865576d16df7d620d5e"} Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.294253 4735 scope.go:117] "RemoveContainer" containerID="f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.296826 4735 generic.go:334] "Generic (PLEG): container finished" podID="3df48875-e111-4fa9-ac81-e3ae198c221a" containerID="6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8" exitCode=0 Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.296854 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" event={"ID":"3df48875-e111-4fa9-ac81-e3ae198c221a","Type":"ContainerDied","Data":"6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8"} Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.296869 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" event={"ID":"3df48875-e111-4fa9-ac81-e3ae198c221a","Type":"ContainerDied","Data":"edc7b342e49ed7c1f3fc223f1f3b9b000b33b3f702e0f9753399ea451fa67eb6"} Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.296889 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b458cf-ff2pt" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.307972 4735 scope.go:117] "RemoveContainer" containerID="f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826" Dec 09 15:03:50 crc kubenswrapper[4735]: E1209 15:03:50.308287 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826\": container with ID starting with f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826 not found: ID does not exist" containerID="f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.308327 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826"} err="failed to get container status \"f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826\": rpc error: code = NotFound desc = could not find container \"f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826\": container with ID starting with f6c659b4b35d4b08b346ae567771539dcdcb5c3ab1f5dfdcbb37585e82e80826 not found: ID does not exist" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.308349 4735 scope.go:117] "RemoveContainer" containerID="6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.315415 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg"] Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.317723 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cdcb8b5d6-nfqhg"] Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.323169 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b458cf-ff2pt"] Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.328137 4735 scope.go:117] "RemoveContainer" containerID="6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8" Dec 09 15:03:50 crc kubenswrapper[4735]: E1209 15:03:50.328456 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8\": container with ID starting with 6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8 not found: ID does not exist" containerID="6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.328502 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8"} err="failed to get container status \"6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8\": rpc error: code = NotFound desc = could not find container \"6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8\": container with ID starting with 6392d40b371fde438883118aadbed94575ee33ac29c10cc949d38d461c2d6ed8 not found: ID does not exist" Dec 09 15:03:50 crc kubenswrapper[4735]: I1209 15:03:50.331749 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b458cf-ff2pt"] Dec 09 15:03:51 crc 
kubenswrapper[4735]: I1209 15:03:51.072528 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75d4679d-jwflb"] Dec 09 15:03:51 crc kubenswrapper[4735]: E1209 15:03:51.072709 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df48875-e111-4fa9-ac81-e3ae198c221a" containerName="controller-manager" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.072721 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df48875-e111-4fa9-ac81-e3ae198c221a" containerName="controller-manager" Dec 09 15:03:51 crc kubenswrapper[4735]: E1209 15:03:51.072735 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8238321-a193-4ac5-b027-b8e3f8962ebe" containerName="route-controller-manager" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.072740 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8238321-a193-4ac5-b027-b8e3f8962ebe" containerName="route-controller-manager" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.072819 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df48875-e111-4fa9-ac81-e3ae198c221a" containerName="controller-manager" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.072832 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8238321-a193-4ac5-b027-b8e3f8962ebe" containerName="route-controller-manager" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.073153 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.075457 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk"] Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.076019 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.079898 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.079991 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.080051 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.080142 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.080166 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.080330 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.080949 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.081036 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.081064 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.081232 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.081409 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.081582 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.083371 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk"] Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.085601 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d4679d-jwflb"] Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.086089 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-config\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089593 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-client-ca\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089613 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8dcb375-b642-4488-9269-63f95f4338e6-client-ca\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089686 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-proxy-ca-bundles\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7bn\" (UniqueName: \"kubernetes.io/projected/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-kube-api-access-sv7bn\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089836 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8dcb375-b642-4488-9269-63f95f4338e6-serving-cert\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089862 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-serving-cert\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089884 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nr6x\" (UniqueName: \"kubernetes.io/projected/b8dcb375-b642-4488-9269-63f95f4338e6-kube-api-access-2nr6x\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.089902 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dcb375-b642-4488-9269-63f95f4338e6-config\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191134 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7bn\" 
(UniqueName: \"kubernetes.io/projected/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-kube-api-access-sv7bn\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191201 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8dcb375-b642-4488-9269-63f95f4338e6-serving-cert\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191226 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-serving-cert\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191275 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nr6x\" (UniqueName: \"kubernetes.io/projected/b8dcb375-b642-4488-9269-63f95f4338e6-kube-api-access-2nr6x\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dcb375-b642-4488-9269-63f95f4338e6-config\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-config\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.191380 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-client-ca\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.192719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8dcb375-b642-4488-9269-63f95f4338e6-client-ca\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.193478 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-proxy-ca-bundles\") pod \"controller-manager-75d4679d-jwflb\" (UID: 
\"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.192580 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-config\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.192257 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-client-ca\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.193423 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b8dcb375-b642-4488-9269-63f95f4338e6-client-ca\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.192691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dcb375-b642-4488-9269-63f95f4338e6-config\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.194155 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-serving-cert\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.194188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8dcb375-b642-4488-9269-63f95f4338e6-serving-cert\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.194694 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-proxy-ca-bundles\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.204379 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nr6x\" (UniqueName: \"kubernetes.io/projected/b8dcb375-b642-4488-9269-63f95f4338e6-kube-api-access-2nr6x\") pod \"route-controller-manager-576b7bb8b9-snmbk\" (UID: \"b8dcb375-b642-4488-9269-63f95f4338e6\") " pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.206016 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sv7bn\" (UniqueName: \"kubernetes.io/projected/0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d-kube-api-access-sv7bn\") pod \"controller-manager-75d4679d-jwflb\" (UID: \"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d\") " pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.386417 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.390430 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.428229 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df48875-e111-4fa9-ac81-e3ae198c221a" path="/var/lib/kubelet/pods/3df48875-e111-4fa9-ac81-e3ae198c221a/volumes" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.428844 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8238321-a193-4ac5-b027-b8e3f8962ebe" path="/var/lib/kubelet/pods/c8238321-a193-4ac5-b027-b8e3f8962ebe/volumes" Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.606708 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75d4679d-jwflb"] Dec 09 15:03:51 crc kubenswrapper[4735]: I1209 15:03:51.815911 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk"] Dec 09 15:03:51 crc kubenswrapper[4735]: W1209 15:03:51.820184 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8dcb375_b642_4488_9269_63f95f4338e6.slice/crio-5c6cf92852e3bf163ab17071f78cc950e9dd32be91f0966a7bee3b0fa310a9a6 WatchSource:0}: Error finding container 5c6cf92852e3bf163ab17071f78cc950e9dd32be91f0966a7bee3b0fa310a9a6: Status 404 returned error can't find the container with id 5c6cf92852e3bf163ab17071f78cc950e9dd32be91f0966a7bee3b0fa310a9a6 Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.307187 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" event={"ID":"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d","Type":"ContainerStarted","Data":"ba659b674642140fda6760c5af9e4a9ae318b9ff6c9b252290a995cca7f53f02"} Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.307234 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" event={"ID":"0f8b5422-7d0a-4298-8ea0-a8dfc7cc178d","Type":"ContainerStarted","Data":"7bfd3fdd1f83c8f52536f65995e91ae4378f3cdc033bf4abeeed8a2f0f02627b"} Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.307458 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.311540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" event={"ID":"b8dcb375-b642-4488-9269-63f95f4338e6","Type":"ContainerStarted","Data":"596f6881fa2b8a6b831b137139c4faa81456da1bd15743a311c18a18ae06b397"} Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.311565 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" event={"ID":"b8dcb375-b642-4488-9269-63f95f4338e6","Type":"ContainerStarted","Data":"5c6cf92852e3bf163ab17071f78cc950e9dd32be91f0966a7bee3b0fa310a9a6"} Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.311910 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.316993 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.328692 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75d4679d-jwflb" podStartSLOduration=3.32867959 podStartE2EDuration="3.32867959s" podCreationTimestamp="2025-12-09 15:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:52.327645529 +0000 UTC m=+311.252484157" watchObservedRunningTime="2025-12-09 15:03:52.32867959 +0000 UTC m=+311.253518217" Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.339631 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" podStartSLOduration=3.339618283 podStartE2EDuration="3.339618283s" podCreationTimestamp="2025-12-09 15:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:52.338074641 +0000 UTC m=+311.262913269" watchObservedRunningTime="2025-12-09 15:03:52.339618283 +0000 UTC m=+311.264456911" Dec 09 15:03:52 crc kubenswrapper[4735]: I1209 15:03:52.408733 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-576b7bb8b9-snmbk" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.485273 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7hnpf"] Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.486228 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.495554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7hnpf"] Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638320 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-registry-tls\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-bound-sa-token\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638425 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/837c3585-e612-444b-98eb-37a371744501-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638468 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvl5k\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-kube-api-access-pvl5k\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638603 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/837c3585-e612-444b-98eb-37a371744501-registry-certificates\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638753 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/837c3585-e612-444b-98eb-37a371744501-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.638786 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/837c3585-e612-444b-98eb-37a371744501-trusted-ca\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.654723 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.739598 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/837c3585-e612-444b-98eb-37a371744501-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.739641 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/837c3585-e612-444b-98eb-37a371744501-trusted-ca\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.739667 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-registry-tls\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.739685 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-bound-sa-token\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.739744 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/837c3585-e612-444b-98eb-37a371744501-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.740017 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/837c3585-e612-444b-98eb-37a371744501-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.740307 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvl5k\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-kube-api-access-pvl5k\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.740358 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/837c3585-e612-444b-98eb-37a371744501-registry-certificates\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.741316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/837c3585-e612-444b-98eb-37a371744501-trusted-ca\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.741337 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/837c3585-e612-444b-98eb-37a371744501-registry-certificates\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.748261 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-registry-tls\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.748271 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/837c3585-e612-444b-98eb-37a371744501-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.756913 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvl5k\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-kube-api-access-pvl5k\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.757289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/837c3585-e612-444b-98eb-37a371744501-bound-sa-token\") pod \"image-registry-66df7c8f76-7hnpf\" (UID: \"837c3585-e612-444b-98eb-37a371744501\") " pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:55 crc kubenswrapper[4735]: I1209 15:03:55.799262 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:03:56 crc kubenswrapper[4735]: I1209 15:03:56.149727 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7hnpf"] Dec 09 15:03:56 crc kubenswrapper[4735]: I1209 15:03:56.327423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" event={"ID":"837c3585-e612-444b-98eb-37a371744501","Type":"ContainerStarted","Data":"2ff4530f378213ca6fb4a5a52c6f18ec724f4feb099e7228acea1c89b41fe05e"} Dec 09 15:03:56 crc kubenswrapper[4735]: I1209 15:03:56.327468 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" event={"ID":"837c3585-e612-444b-98eb-37a371744501","Type":"ContainerStarted","Data":"e6225b7873355690a5362ac7f4c0fcd4a4d82d35846b3a99a0a84ca26e8dc3c3"} Dec 09 15:03:56 crc kubenswrapper[4735]: I1209 15:03:56.342645 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" podStartSLOduration=1.342632268 podStartE2EDuration="1.342632268s" podCreationTimestamp="2025-12-09 15:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:03:56.339532611 +0000 UTC m=+315.264371249" watchObservedRunningTime="2025-12-09 15:03:56.342632268 +0000 UTC m=+315.267470896" Dec 09 15:03:57 crc kubenswrapper[4735]: I1209 15:03:57.331728 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:04:04 crc kubenswrapper[4735]: I1209 15:04:04.335963 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:04:04 crc kubenswrapper[4735]: I1209 15:04:04.336217 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:04:15 crc kubenswrapper[4735]: I1209 15:04:15.803110 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7hnpf" Dec 09 15:04:15 crc kubenswrapper[4735]: I1209 15:04:15.840383 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gwkvf"] Dec 09 15:04:34 crc kubenswrapper[4735]: I1209 15:04:34.335312 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:04:34 crc kubenswrapper[4735]: I1209 15:04:34.336394 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 09 15:04:40 crc kubenswrapper[4735]: I1209 15:04:40.863299 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" podUID="5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" containerName="registry" containerID="cri-o://0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5" gracePeriod=30 Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.165588 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297378 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-certificates\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297433 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-ca-trust-extracted\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297649 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297676 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-trusted-ca\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297700 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-installation-pull-secrets\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297721 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-bound-sa-token\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzghx\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-kube-api-access-jzghx\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.297767 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-tls\") pod \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\" (UID: \"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891\") " Dec 09 15:04:41 crc 
kubenswrapper[4735]: I1209 15:04:41.298174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.298209 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.301972 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.302043 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.302135 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.302402 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-kube-api-access-jzghx" (OuterVolumeSpecName: "kube-api-access-jzghx") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "kube-api-access-jzghx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.304174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.325508 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" (UID: "5d00084f-8f8b-4cd8-93b6-a05ec4ac5891"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398412 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398449 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398460 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398469 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398477 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398485 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzghx\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-kube-api-access-jzghx\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.398493 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.491981 4735 generic.go:334] "Generic (PLEG): container finished" podID="5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" containerID="0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5" exitCode=0 Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.492018 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" event={"ID":"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891","Type":"ContainerDied","Data":"0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5"} Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.492043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" event={"ID":"5d00084f-8f8b-4cd8-93b6-a05ec4ac5891","Type":"ContainerDied","Data":"9ed1c265781dbd87d9a6bdaa3f19cd4be724674ae1858e5c80b0dcb4f4c9b716"} Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.492058 4735 scope.go:117] "RemoveContainer" containerID="0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.492333 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gwkvf" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.504914 4735 scope.go:117] "RemoveContainer" containerID="0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5" Dec 09 15:04:41 crc kubenswrapper[4735]: E1209 15:04:41.505176 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5\": container with ID starting with 0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5 not found: ID does not exist" containerID="0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.505200 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5"} err="failed to get container status \"0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5\": rpc error: code = NotFound desc = could not find container \"0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5\": container with ID starting with 0a28eef5e244cc2a89b11c859be68076bf08195fd47ec042b6ab70f89ed602d5 not found: ID does not exist" Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.505422 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gwkvf"] Dec 09 15:04:41 crc kubenswrapper[4735]: I1209 15:04:41.507700 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gwkvf"] Dec 09 15:04:43 crc kubenswrapper[4735]: I1209 15:04:43.418851 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" path="/var/lib/kubelet/pods/5d00084f-8f8b-4cd8-93b6-a05ec4ac5891/volumes" Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.335908 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.336359 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.336402 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.336826 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04b7c863f3e25aee025e034071815199469e37d7bb0cc98f93e577fecf50982a"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.336885 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" 
podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://04b7c863f3e25aee025e034071815199469e37d7bb0cc98f93e577fecf50982a" gracePeriod=600 Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.570215 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="04b7c863f3e25aee025e034071815199469e37d7bb0cc98f93e577fecf50982a" exitCode=0 Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.570288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"04b7c863f3e25aee025e034071815199469e37d7bb0cc98f93e577fecf50982a"} Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.570440 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"7283bb739f18015c714dddde96f1ec57c18cf87c97f630bb012cb5c7d38a8190"} Dec 09 15:05:04 crc kubenswrapper[4735]: I1209 15:05:04.570477 4735 scope.go:117] "RemoveContainer" containerID="57526bc24aaad65fc343830aa00f63f5a9608de92b08623aa0d6d74895b3e753" Dec 09 15:07:04 crc kubenswrapper[4735]: I1209 15:07:04.336074 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:07:04 crc kubenswrapper[4735]: I1209 15:07:04.336547 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:07:34 crc kubenswrapper[4735]: I1209 15:07:34.336229 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:07:34 crc kubenswrapper[4735]: I1209 15:07:34.336836 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:08:04 crc kubenswrapper[4735]: I1209 15:08:04.335555 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:08:04 crc kubenswrapper[4735]: I1209 15:08:04.335918 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:08:04 
crc kubenswrapper[4735]: I1209 15:08:04.335953 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:08:04 crc kubenswrapper[4735]: I1209 15:08:04.336340 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7283bb739f18015c714dddde96f1ec57c18cf87c97f630bb012cb5c7d38a8190"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:08:04 crc kubenswrapper[4735]: I1209 15:08:04.336391 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://7283bb739f18015c714dddde96f1ec57c18cf87c97f630bb012cb5c7d38a8190" gracePeriod=600 Dec 09 15:08:05 crc kubenswrapper[4735]: I1209 15:08:05.230472 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="7283bb739f18015c714dddde96f1ec57c18cf87c97f630bb012cb5c7d38a8190" exitCode=0 Dec 09 15:08:05 crc kubenswrapper[4735]: I1209 15:08:05.230540 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"7283bb739f18015c714dddde96f1ec57c18cf87c97f630bb012cb5c7d38a8190"} Dec 09 15:08:05 crc kubenswrapper[4735]: I1209 15:08:05.230706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"32f296cb608e9d91aaf8195ce2837766de47464c288698a32b6b4cd28703999c"} Dec 09 15:08:05 crc kubenswrapper[4735]: I1209 15:08:05.230728 4735 scope.go:117] "RemoveContainer" containerID="04b7c863f3e25aee025e034071815199469e37d7bb0cc98f93e577fecf50982a" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.486681 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr"] Dec 09 15:08:14 crc kubenswrapper[4735]: E1209 15:08:14.487272 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" containerName="registry" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.487283 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" containerName="registry" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.487364 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d00084f-8f8b-4cd8-93b6-a05ec4ac5891" containerName="registry" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.487938 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.489352 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.492041 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr"] Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.668272 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.668314 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnt4q\" (UniqueName: \"kubernetes.io/projected/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-kube-api-access-fnt4q\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.668354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.768951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.769037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.769057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnt4q\" (UniqueName: \"kubernetes.io/projected/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-kube-api-access-fnt4q\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.769389 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.769464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.784002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnt4q\" (UniqueName: \"kubernetes.io/projected/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-kube-api-access-fnt4q\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.799782 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:14 crc kubenswrapper[4735]: I1209 15:08:14.953613 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr"] Dec 09 15:08:15 crc kubenswrapper[4735]: I1209 15:08:15.275908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" event={"ID":"3a8584cb-b1fa-4d42-b2f8-7b3435a67747","Type":"ContainerStarted","Data":"d21baaeeb18839e015a519228d73a0ff59a63f2d590460e48fe5957010853c81"} Dec 09 15:08:15 crc kubenswrapper[4735]: I1209 15:08:15.276261 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" event={"ID":"3a8584cb-b1fa-4d42-b2f8-7b3435a67747","Type":"ContainerStarted","Data":"b229fc9c8cbbd2e27be6aa6259906d2dc51403609ff46af9416b6375f9314c9f"} Dec 09 15:08:16 crc kubenswrapper[4735]: I1209 15:08:16.280418 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerID="d21baaeeb18839e015a519228d73a0ff59a63f2d590460e48fe5957010853c81" exitCode=0 Dec 09 15:08:16 crc kubenswrapper[4735]: I1209 15:08:16.280461 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" event={"ID":"3a8584cb-b1fa-4d42-b2f8-7b3435a67747","Type":"ContainerDied","Data":"d21baaeeb18839e015a519228d73a0ff59a63f2d590460e48fe5957010853c81"} Dec 09 15:08:16 crc kubenswrapper[4735]: I1209 15:08:16.281685 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:08:17 crc kubenswrapper[4735]: I1209 15:08:17.287589 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerID="b648936c1a2a0d31eb73dea68aaf0eae2e9494225374425b5d2341ec59758a05" exitCode=0 Dec 09 15:08:17 crc kubenswrapper[4735]: I1209 15:08:17.287963 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" event={"ID":"3a8584cb-b1fa-4d42-b2f8-7b3435a67747","Type":"ContainerDied","Data":"b648936c1a2a0d31eb73dea68aaf0eae2e9494225374425b5d2341ec59758a05"} Dec 09 15:08:18 crc kubenswrapper[4735]: I1209 15:08:18.292991 4735 generic.go:334] "Generic (PLEG): container finished" podID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerID="e9fd308207e02c8b8646252f1c03607122b01fe9af61bac807e6b52a610463f2" exitCode=0 Dec 09 15:08:18 crc kubenswrapper[4735]: I1209 15:08:18.293035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" event={"ID":"3a8584cb-b1fa-4d42-b2f8-7b3435a67747","Type":"ContainerDied","Data":"e9fd308207e02c8b8646252f1c03607122b01fe9af61bac807e6b52a610463f2"} Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.457919 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.623015 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnt4q\" (UniqueName: \"kubernetes.io/projected/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-kube-api-access-fnt4q\") pod \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.623066 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-bundle\") pod \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.623181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-util\") pod \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\" (UID: \"3a8584cb-b1fa-4d42-b2f8-7b3435a67747\") " Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.624623 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-bundle" (OuterVolumeSpecName: "bundle") pod "3a8584cb-b1fa-4d42-b2f8-7b3435a67747" (UID: "3a8584cb-b1fa-4d42-b2f8-7b3435a67747"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.628975 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-kube-api-access-fnt4q" (OuterVolumeSpecName: "kube-api-access-fnt4q") pod "3a8584cb-b1fa-4d42-b2f8-7b3435a67747" (UID: "3a8584cb-b1fa-4d42-b2f8-7b3435a67747"). InnerVolumeSpecName "kube-api-access-fnt4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.634062 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-util" (OuterVolumeSpecName: "util") pod "3a8584cb-b1fa-4d42-b2f8-7b3435a67747" (UID: "3a8584cb-b1fa-4d42-b2f8-7b3435a67747"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.724821 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.724845 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnt4q\" (UniqueName: \"kubernetes.io/projected/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-kube-api-access-fnt4q\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:19 crc kubenswrapper[4735]: I1209 15:08:19.724857 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a8584cb-b1fa-4d42-b2f8-7b3435a67747-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:20 crc kubenswrapper[4735]: I1209 15:08:20.301759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" event={"ID":"3a8584cb-b1fa-4d42-b2f8-7b3435a67747","Type":"ContainerDied","Data":"b229fc9c8cbbd2e27be6aa6259906d2dc51403609ff46af9416b6375f9314c9f"} Dec 09 15:08:20 crc kubenswrapper[4735]: I1209 15:08:20.302126 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b229fc9c8cbbd2e27be6aa6259906d2dc51403609ff46af9416b6375f9314c9f" Dec 09 15:08:20 crc kubenswrapper[4735]: I1209 15:08:20.301836 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.607772 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qblcd"] Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608278 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-controller" containerID="cri-o://bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608321 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="northd" containerID="cri-o://9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608370 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-acl-logging" containerID="cri-o://711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608372 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="nbdb" containerID="cri-o://8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608409 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608383 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="sbdb" containerID="cri-o://1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.608322 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-node" containerID="cri-o://38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.642942 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" containerID="cri-o://3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" gracePeriod=30 Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.873169 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/3.log" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.875615 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovn-acl-logging/0.log" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.876086 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovn-controller/0.log" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.876450 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923204 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pf92r"] Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923412 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="sbdb" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923429 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="sbdb" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923437 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="extract" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923444 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="extract" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923451 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="northd" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923456 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="northd" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923462 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923469 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923476 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kubecfg-setup" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923481 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kubecfg-setup" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923487 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923492 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923497 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="nbdb" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923502 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="nbdb" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923524 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923530 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923537 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="util" Dec 09 15:08:26 
crc kubenswrapper[4735]: I1209 15:08:26.923542 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="util" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923548 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="pull" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923553 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="pull" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923562 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-acl-logging" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923569 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-acl-logging" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923575 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923580 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923586 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923591 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923599 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-node" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923604 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-node" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923684 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-acl-logging" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923692 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-node" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923710 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="northd" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923715 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923722 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923728 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a8584cb-b1fa-4d42-b2f8-7b3435a67747" containerName="extract" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923734 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" 
containerName="sbdb" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923741 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923750 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovn-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923757 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="nbdb" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923763 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="kube-rbac-proxy-ovn-metrics" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923769 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.923849 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923855 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.923931 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: E1209 15:08:26.924008 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.924014 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374566a-4662-4e98-ae18-6f52468332b5" containerName="ovnkube-controller" Dec 09 15:08:26 crc kubenswrapper[4735]: I1209 15:08:26.925169 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.006813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-config\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.006863 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-systemd-units\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.006895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.006936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.006992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-ovn-kubernetes\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007032 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007050 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007121 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-log-socket\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-log-socket" (OuterVolumeSpecName: "log-socket") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007266 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007281 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-script-lib\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-bin\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007625 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007638 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn6dw\" (UniqueName: \"kubernetes.io/projected/9374566a-4662-4e98-ae18-6f52468332b5-kube-api-access-fn6dw\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007660 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-kubelet\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007690 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9374566a-4662-4e98-ae18-6f52468332b5-ovn-node-metrics-cert\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007728 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-netns\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007750 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-var-lib-openvswitch\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007773 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-ovn\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007804 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-systemd\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007823 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-openvswitch\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007803 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007821 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007859 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-netd\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007973 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-slash\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007999 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-node-log\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007875 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008031 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-env-overrides\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008054 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-etc-openvswitch\") pod \"9374566a-4662-4e98-ae18-6f52468332b5\" (UID: \"9374566a-4662-4e98-ae18-6f52468332b5\") " Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008064 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-node-log" (OuterVolumeSpecName: "node-log") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.007860 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008002 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-slash" (OuterVolumeSpecName: "host-slash") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008162 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz8dm\" (UniqueName: \"kubernetes.io/projected/9faa308c-34e4-40c7-996f-a8087be1b78c-kube-api-access-hz8dm\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008632 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008663 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9faa308c-34e4-40c7-996f-a8087be1b78c-ovn-node-metrics-cert\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008685 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-kubelet\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008719 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-systemd\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008778 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-var-lib-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008833 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-run-netns\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-env-overrides\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.008898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-node-log\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009096 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-ovnkube-config\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-log-socket\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-etc-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009189 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-cni-bin\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009210 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-ovnkube-script-lib\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009228 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-systemd-units\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009252 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-slash\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009279 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-ovn\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009328 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-cni-netd\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009348 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009461 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009477 4735 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009494 4735 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009505 4735 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009529 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009538 4735 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-node-log\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009549 4735 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-slash\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009557 4735 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009567 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009576 4735 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009588 4735 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009598 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009607 4735 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-log-socket\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009618 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009631 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009638 4735 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.009770 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.019538 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9374566a-4662-4e98-ae18-6f52468332b5-kube-api-access-fn6dw" (OuterVolumeSpecName: "kube-api-access-fn6dw") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "kube-api-access-fn6dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.021408 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.021765 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9374566a-4662-4e98-ae18-6f52468332b5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9374566a-4662-4e98-ae18-6f52468332b5" (UID: "9374566a-4662-4e98-ae18-6f52468332b5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110215 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz8dm\" (UniqueName: \"kubernetes.io/projected/9faa308c-34e4-40c7-996f-a8087be1b78c-kube-api-access-hz8dm\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110279 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9faa308c-34e4-40c7-996f-a8087be1b78c-ovn-node-metrics-cert\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110295 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-kubelet\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-systemd\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110334 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-var-lib-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-run-netns\") pod 
\"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110367 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-env-overrides\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110363 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110383 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-node-log\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-ovnkube-config\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-log-socket\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110447 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-var-lib-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110454 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-etc-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110472 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-cni-bin\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110470 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-kubelet\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 
crc kubenswrapper[4735]: I1209 15:08:27.110491 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-ovnkube-script-lib\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110507 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-systemd-units\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-slash\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110567 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110584 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-ovn\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110622 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-cni-netd\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110657 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9374566a-4662-4e98-ae18-6f52468332b5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110669 4735 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9374566a-4662-4e98-ae18-6f52468332b5-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110678 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9374566a-4662-4e98-ae18-6f52468332b5-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc 
kubenswrapper[4735]: I1209 15:08:27.110687 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn6dw\" (UniqueName: \"kubernetes.io/projected/9374566a-4662-4e98-ae18-6f52468332b5-kube-api-access-fn6dw\") on node \"crc\" DevicePath \"\"" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110729 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-cni-netd\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110759 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-systemd-units\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110780 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-slash\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-etc-openvswitch\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-ovn\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110806 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-node-log\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110425 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-run-netns\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110840 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110861 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-host-cni-bin\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-log-socket\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.110443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9faa308c-34e4-40c7-996f-a8087be1b78c-run-systemd\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.111079 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-env-overrides\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.111087 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-ovnkube-config\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.111296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9faa308c-34e4-40c7-996f-a8087be1b78c-ovnkube-script-lib\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.114213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9faa308c-34e4-40c7-996f-a8087be1b78c-ovn-node-metrics-cert\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.125301 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz8dm\" (UniqueName: \"kubernetes.io/projected/9faa308c-34e4-40c7-996f-a8087be1b78c-kube-api-access-hz8dm\") pod \"ovnkube-node-pf92r\" (UID: \"9faa308c-34e4-40c7-996f-a8087be1b78c\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.237022 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.328068 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovnkube-controller/3.log" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330042 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovn-acl-logging/0.log" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330432 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qblcd_9374566a-4662-4e98-ae18-6f52468332b5/ovn-controller/0.log" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330725 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" exitCode=0 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330757 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" exitCode=0 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330766 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" exitCode=0 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330774 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" exitCode=0 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330782 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" exitCode=0 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330789 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" exitCode=0 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330800 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" exitCode=143 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330807 4735 generic.go:334] "Generic (PLEG): container finished" podID="9374566a-4662-4e98-ae18-6f52468332b5" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" exitCode=143 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330855 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330891 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330902 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330934 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330945 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330959 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330965 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330972 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330978 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330982 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330988 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330992 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.330997 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331004 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331012 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331018 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331022 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331028 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331032 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331037 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331041 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331046 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331050 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331055 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331068 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331075 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 
15:08:27.331081 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331087 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331094 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331100 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331106 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331113 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331118 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331124 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331132 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" event={"ID":"9374566a-4662-4e98-ae18-6f52468332b5","Type":"ContainerDied","Data":"d35f7d254aa9fb6aba258457e3c45e02bf100e9bc3af5cb8d9fb6f718ec6db12"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331139 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331145 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331149 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331154 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331159 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 
15:08:27.331165 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331169 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331174 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331179 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331183 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331199 4735 scope.go:117] "RemoveContainer" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.331369 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qblcd" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.333035 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"dbb732cce0c91e470baecec66f7c7f67851e2702664466aa80bff4e2f52ab41c"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.334582 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/2.log" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.334927 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/1.log" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.334987 4735 generic.go:334] "Generic (PLEG): container finished" podID="67d17a09-b547-49cf-8195-5af12413f51c" containerID="88bc3bb0b0d1327a3335aadac40c46ac49a79d37f1e1436ccb892cbaa982f40d" exitCode=2 Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.335045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerDied","Data":"88bc3bb0b0d1327a3335aadac40c46ac49a79d37f1e1436ccb892cbaa982f40d"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.335072 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed"} Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.335529 4735 scope.go:117] "RemoveContainer" containerID="88bc3bb0b0d1327a3335aadac40c46ac49a79d37f1e1436ccb892cbaa982f40d" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.335684 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-multus pod=multus-xnf8f_openshift-multus(67d17a09-b547-49cf-8195-5af12413f51c)\"" pod="openshift-multus/multus-xnf8f" podUID="67d17a09-b547-49cf-8195-5af12413f51c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.395005 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.403345 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qblcd"] Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.408549 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qblcd"] Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.413308 4735 scope.go:117] "RemoveContainer" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.419038 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9374566a-4662-4e98-ae18-6f52468332b5" path="/var/lib/kubelet/pods/9374566a-4662-4e98-ae18-6f52468332b5/volumes" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.431866 4735 scope.go:117] "RemoveContainer" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.442370 4735 scope.go:117] "RemoveContainer" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.451387 4735 scope.go:117] "RemoveContainer" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.460319 4735 scope.go:117] "RemoveContainer" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.469939 4735 scope.go:117] "RemoveContainer" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.478900 4735 scope.go:117] "RemoveContainer" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.491801 4735 scope.go:117] "RemoveContainer" containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.504580 4735 scope.go:117] "RemoveContainer" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.505036 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": container with ID starting with 3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c not found: ID does not exist" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.505079 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} err="failed to get container status \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": rpc error: code = NotFound desc = could not find container \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": container with ID starting with 
3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.505111 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.505984 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": container with ID starting with c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6 not found: ID does not exist" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506012 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} err="failed to get container status \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": rpc error: code = NotFound desc = could not find container \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": container with ID starting with c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506031 4735 scope.go:117] "RemoveContainer" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.506289 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": container with ID starting with 1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b not found: ID does not exist" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506318 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} err="failed to get container status \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": rpc error: code = NotFound desc = could not find container \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": container with ID starting with 1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506336 4735 scope.go:117] "RemoveContainer" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.506686 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": container with ID starting with 8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0 not found: ID does not exist" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506717 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} err="failed to get container status \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": rpc 
error: code = NotFound desc = could not find container \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": container with ID starting with 8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506730 4735 scope.go:117] "RemoveContainer" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.506940 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": container with ID starting with 9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2 not found: ID does not exist" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506960 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} err="failed to get container status \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": rpc error: code = NotFound desc = could not find container \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": container with ID starting with 9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.506972 4735 scope.go:117] "RemoveContainer" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.507176 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": container with ID starting with bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596 not found: ID does not exist" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.507202 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} err="failed to get container status \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": rpc error: code = NotFound desc = could not find container \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": container with ID starting with bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.507217 4735 scope.go:117] "RemoveContainer" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.507459 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": container with ID starting with 38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0 not found: ID does not exist" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.507485 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} err="failed to get container status \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": rpc error: code = NotFound desc = could not find container \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": container with ID starting with 38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.507498 4735 scope.go:117] "RemoveContainer" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.507791 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": container with ID starting with 711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1 not found: ID does not exist" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.507813 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} err="failed to get container status \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": rpc error: code = NotFound desc = could not find container \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": container with ID starting with 711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.507826 4735 scope.go:117] "RemoveContainer" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.508024 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": container with ID starting with bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea not found: ID does not exist" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508044 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} err="failed to get container status \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": rpc error: code = NotFound desc = could not find container \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": container with ID starting with bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508056 4735 scope.go:117] "RemoveContainer" containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" Dec 09 15:08:27 crc kubenswrapper[4735]: E1209 15:08:27.508286 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": container with ID starting with 62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4 not found: ID does not exist" 
containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508303 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} err="failed to get container status \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": rpc error: code = NotFound desc = could not find container \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": container with ID starting with 62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508319 4735 scope.go:117] "RemoveContainer" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508642 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} err="failed to get container status \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": rpc error: code = NotFound desc = could not find container \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": container with ID starting with 3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508664 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508892 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} err="failed to get container status \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": rpc error: code = NotFound desc = could not find container \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": container with ID starting with c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.508911 4735 scope.go:117] "RemoveContainer" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509114 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} err="failed to get container status \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": rpc error: code = NotFound desc = could not find container \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": container with ID starting with 1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509132 4735 scope.go:117] "RemoveContainer" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509316 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} err="failed to get container status \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": rpc error: code = NotFound desc = could not find 
container \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": container with ID starting with 8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509334 4735 scope.go:117] "RemoveContainer" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509540 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} err="failed to get container status \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": rpc error: code = NotFound desc = could not find container \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": container with ID starting with 9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509557 4735 scope.go:117] "RemoveContainer" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509759 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} err="failed to get container status \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": rpc error: code = NotFound desc = could not find container \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": container with ID starting with bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509776 4735 scope.go:117] "RemoveContainer" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509970 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} err="failed to get container status \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": rpc error: code = NotFound desc = could not find container \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": container with ID starting with 38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.509989 4735 scope.go:117] "RemoveContainer" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510198 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} err="failed to get container status \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": rpc error: code = NotFound desc = could not find container \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": container with ID starting with 711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510217 4735 scope.go:117] "RemoveContainer" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510414 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} err="failed to get container status \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": rpc error: code = NotFound desc = could not find container \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": container with ID starting with bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510431 4735 scope.go:117] "RemoveContainer" containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510632 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} err="failed to get container status \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": rpc error: code = NotFound desc = could not find container \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": container with ID starting with 62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510649 4735 scope.go:117] "RemoveContainer" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510859 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} err="failed to get container status \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": rpc error: code = NotFound desc = could not find container \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": container with ID starting with 3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.510877 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.511078 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} err="failed to get container status \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": rpc error: code = NotFound desc = could not find container \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": container with ID starting with c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.511096 4735 scope.go:117] "RemoveContainer" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.511285 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} err="failed to get container status \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": rpc error: code = NotFound desc = could not find container \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": container with ID starting with 
1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.511306 4735 scope.go:117] "RemoveContainer" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.511493 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} err="failed to get container status \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": rpc error: code = NotFound desc = could not find container \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": container with ID starting with 8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.511527 4735 scope.go:117] "RemoveContainer" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.518587 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} err="failed to get container status \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": rpc error: code = NotFound desc = could not find container \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": container with ID starting with 9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.518611 4735 scope.go:117] "RemoveContainer" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.518931 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} err="failed to get container status \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": rpc error: code = NotFound desc = could not find container \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": container with ID starting with bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.518991 4735 scope.go:117] "RemoveContainer" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.519329 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} err="failed to get container status \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": rpc error: code = NotFound desc = could not find container \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": container with ID starting with 38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.519355 4735 scope.go:117] "RemoveContainer" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.519959 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} err="failed to get container status \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": rpc error: code = NotFound desc = could not find container \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": container with ID starting with 711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.519984 4735 scope.go:117] "RemoveContainer" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520198 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} err="failed to get container status \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": rpc error: code = NotFound desc = could not find container \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": container with ID starting with bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520222 4735 scope.go:117] "RemoveContainer" containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520436 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} err="failed to get container status \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": rpc error: code = NotFound desc = could not find container \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": container with ID starting with 62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520457 4735 scope.go:117] "RemoveContainer" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520696 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} err="failed to get container status \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": rpc error: code = NotFound desc = could not find container \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": container with ID starting with 3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520732 4735 scope.go:117] "RemoveContainer" containerID="c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520955 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6"} err="failed to get container status \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": rpc error: code = NotFound desc = could not find container \"c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6\": container with ID starting with c4269a811f38154e0d33db676fe383037fccc831f4a0b3c0bad956461f21dba6 not found: ID does not exist" Dec 
09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.520977 4735 scope.go:117] "RemoveContainer" containerID="1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.521169 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b"} err="failed to get container status \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": rpc error: code = NotFound desc = could not find container \"1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b\": container with ID starting with 1b943dddc62a59381a6c5ac43126c54f1727305f608eccbf7ffb34a9b6a3063b not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.521190 4735 scope.go:117] "RemoveContainer" containerID="8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.521387 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0"} err="failed to get container status \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": rpc error: code = NotFound desc = could not find container \"8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0\": container with ID starting with 8f7e97e979a035f8401bdecab23adaa21da46beb6c14f600964351af7449d2a0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.521405 4735 scope.go:117] "RemoveContainer" containerID="9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.521611 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2"} err="failed to get container status \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": rpc error: code = NotFound desc = could not find container \"9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2\": container with ID starting with 9747f83797f110782d9b374d3d7fa4323aaf9caf2fac2d103aa10b811a2382f2 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.521629 4735 scope.go:117] "RemoveContainer" containerID="bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.526734 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596"} err="failed to get container status \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": rpc error: code = NotFound desc = could not find container \"bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596\": container with ID starting with bfd823c4276d869ac3b4205edbfbdd78a7e220b25f2b616c1d1f66a799525596 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.526756 4735 scope.go:117] "RemoveContainer" containerID="38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.526995 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0"} err="failed to get container status 
\"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": rpc error: code = NotFound desc = could not find container \"38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0\": container with ID starting with 38e8f556366775f6c3651d24c456dfbc9f36ac29800a320b6c292d74cfebf3d0 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527023 4735 scope.go:117] "RemoveContainer" containerID="711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527254 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1"} err="failed to get container status \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": rpc error: code = NotFound desc = could not find container \"711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1\": container with ID starting with 711508ffec77a17fb353a433e055dfba3dc7ae760a7bcd3adcd9df0f31a037e1 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527280 4735 scope.go:117] "RemoveContainer" containerID="bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527539 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea"} err="failed to get container status \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": rpc error: code = NotFound desc = could not find container \"bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea\": container with ID starting with bc2355b9e0d0b92f0a7ef4d28e26c3bb85bc1bb46f19a128b4befa6fae06fdea not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527559 4735 scope.go:117] "RemoveContainer" containerID="62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527792 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4"} err="failed to get container status \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": rpc error: code = NotFound desc = could not find container \"62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4\": container with ID starting with 62b008c04da042448e0c90e44c441d3cc159e7005489b176b8e99bb59df927d4 not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.527806 4735 scope.go:117] "RemoveContainer" containerID="3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.528000 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c"} err="failed to get container status \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": rpc error: code = NotFound desc = could not find container \"3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c\": container with ID starting with 3a49857adf535ba0f6b13a189d36ff223ad3b8dbd775083855b417d2a85b528c not found: ID does not exist" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.754645 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47"] Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.755731 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.756855 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-2ktlt" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.757109 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.757290 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.821982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s999v\" (UniqueName: \"kubernetes.io/projected/bfe62381-3825-4204-bbf4-8970225de2c4-kube-api-access-s999v\") pod \"obo-prometheus-operator-668cf9dfbb-nqm47\" (UID: \"bfe62381-3825-4204-bbf4-8970225de2c4\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.875804 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8"] Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.876677 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.881855 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp"] Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.881948 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.882147 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qljt6" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.882397 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.923681 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0007731d-b209-44eb-b95a-5b3b95a02ac2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp\" (UID: \"0007731d-b209-44eb-b95a-5b3b95a02ac2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.923732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/205460a8-cdae-43e1-8b9c-123f7f4f8c29-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8\" (UID: \"205460a8-cdae-43e1-8b9c-123f7f4f8c29\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.923761 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s999v\" (UniqueName: \"kubernetes.io/projected/bfe62381-3825-4204-bbf4-8970225de2c4-kube-api-access-s999v\") pod \"obo-prometheus-operator-668cf9dfbb-nqm47\" (UID: \"bfe62381-3825-4204-bbf4-8970225de2c4\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.923784 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0007731d-b209-44eb-b95a-5b3b95a02ac2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp\" (UID: \"0007731d-b209-44eb-b95a-5b3b95a02ac2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.923813 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/205460a8-cdae-43e1-8b9c-123f7f4f8c29-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8\" (UID: \"205460a8-cdae-43e1-8b9c-123f7f4f8c29\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.936940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s999v\" (UniqueName: \"kubernetes.io/projected/bfe62381-3825-4204-bbf4-8970225de2c4-kube-api-access-s999v\") pod \"obo-prometheus-operator-668cf9dfbb-nqm47\" (UID: \"bfe62381-3825-4204-bbf4-8970225de2c4\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.975449 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-h5bn2"] Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.976121 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.979842 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-brxld" Dec 09 15:08:27 crc kubenswrapper[4735]: I1209 15:08:27.980206 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.024183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/205460a8-cdae-43e1-8b9c-123f7f4f8c29-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8\" (UID: \"205460a8-cdae-43e1-8b9c-123f7f4f8c29\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.024218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx5mv\" (UniqueName: \"kubernetes.io/projected/63cae057-68b9-4d57-a64e-fd9314da6cfd-kube-api-access-lx5mv\") pod \"observability-operator-d8bb48f5d-h5bn2\" (UID: \"63cae057-68b9-4d57-a64e-fd9314da6cfd\") " pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.024248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/63cae057-68b9-4d57-a64e-fd9314da6cfd-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-h5bn2\" (UID: \"63cae057-68b9-4d57-a64e-fd9314da6cfd\") " pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.024278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0007731d-b209-44eb-b95a-5b3b95a02ac2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp\" (UID: \"0007731d-b209-44eb-b95a-5b3b95a02ac2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.024371 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/205460a8-cdae-43e1-8b9c-123f7f4f8c29-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8\" (UID: \"205460a8-cdae-43e1-8b9c-123f7f4f8c29\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.024441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0007731d-b209-44eb-b95a-5b3b95a02ac2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp\" (UID: \"0007731d-b209-44eb-b95a-5b3b95a02ac2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.026743 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0007731d-b209-44eb-b95a-5b3b95a02ac2-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp\" (UID: 
\"0007731d-b209-44eb-b95a-5b3b95a02ac2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.027035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0007731d-b209-44eb-b95a-5b3b95a02ac2-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp\" (UID: \"0007731d-b209-44eb-b95a-5b3b95a02ac2\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.027229 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/205460a8-cdae-43e1-8b9c-123f7f4f8c29-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8\" (UID: \"205460a8-cdae-43e1-8b9c-123f7f4f8c29\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.027480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/205460a8-cdae-43e1-8b9c-123f7f4f8c29-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8\" (UID: \"205460a8-cdae-43e1-8b9c-123f7f4f8c29\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.065870 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.090712 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-w2xlx"] Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.091277 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: W1209 15:08:28.092971 4735 reflector.go:561] object-"openshift-operators"/"perses-operator-dockercfg-f4rnb": failed to list *v1.Secret: secrets "perses-operator-dockercfg-f4rnb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.093004 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"perses-operator-dockercfg-f4rnb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"perses-operator-dockercfg-f4rnb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.098860 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(778de6f8bf77bf991680cc408d4487bdde1f4b9a8a49319c7f8fba21a2ac5776): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.098900 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(778de6f8bf77bf991680cc408d4487bdde1f4b9a8a49319c7f8fba21a2ac5776): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.098919 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(778de6f8bf77bf991680cc408d4487bdde1f4b9a8a49319c7f8fba21a2ac5776): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.098952 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators(bfe62381-3825-4204-bbf4-8970225de2c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators(bfe62381-3825-4204-bbf4-8970225de2c4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(778de6f8bf77bf991680cc408d4487bdde1f4b9a8a49319c7f8fba21a2ac5776): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" podUID="bfe62381-3825-4204-bbf4-8970225de2c4" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.125540 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx5mv\" (UniqueName: \"kubernetes.io/projected/63cae057-68b9-4d57-a64e-fd9314da6cfd-kube-api-access-lx5mv\") pod \"observability-operator-d8bb48f5d-h5bn2\" (UID: \"63cae057-68b9-4d57-a64e-fd9314da6cfd\") " pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.125580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/63cae057-68b9-4d57-a64e-fd9314da6cfd-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-h5bn2\" (UID: \"63cae057-68b9-4d57-a64e-fd9314da6cfd\") " pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.125599 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zq8s\" (UniqueName: \"kubernetes.io/projected/2c5b3de4-5006-4a83-b672-1fa5f2bf2cec-kube-api-access-9zq8s\") pod \"perses-operator-5446b9c989-w2xlx\" (UID: \"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec\") " pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.125627 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c5b3de4-5006-4a83-b672-1fa5f2bf2cec-openshift-service-ca\") pod \"perses-operator-5446b9c989-w2xlx\" (UID: \"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec\") " pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.128503 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/63cae057-68b9-4d57-a64e-fd9314da6cfd-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-h5bn2\" (UID: \"63cae057-68b9-4d57-a64e-fd9314da6cfd\") " pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.139885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx5mv\" (UniqueName: \"kubernetes.io/projected/63cae057-68b9-4d57-a64e-fd9314da6cfd-kube-api-access-lx5mv\") pod \"observability-operator-d8bb48f5d-h5bn2\" (UID: \"63cae057-68b9-4d57-a64e-fd9314da6cfd\") " pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.187786 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.195054 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.205693 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(0c0b1cb5641cb3d05fdbcc21cd60c8fc569c437f6073a1f7487c6b9e1b0e505c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.205762 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(0c0b1cb5641cb3d05fdbcc21cd60c8fc569c437f6073a1f7487c6b9e1b0e505c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.205787 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(0c0b1cb5641cb3d05fdbcc21cd60c8fc569c437f6073a1f7487c6b9e1b0e505c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.205841 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators(205460a8-cdae-43e1-8b9c-123f7f4f8c29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators(205460a8-cdae-43e1-8b9c-123f7f4f8c29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(0c0b1cb5641cb3d05fdbcc21cd60c8fc569c437f6073a1f7487c6b9e1b0e505c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" podUID="205460a8-cdae-43e1-8b9c-123f7f4f8c29" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.213595 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(e4df96221beb6da8cd3ec428ba00a6db14cd75945cc4b147c21a1e65cecf3f48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.213639 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(e4df96221beb6da8cd3ec428ba00a6db14cd75945cc4b147c21a1e65cecf3f48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.213667 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(e4df96221beb6da8cd3ec428ba00a6db14cd75945cc4b147c21a1e65cecf3f48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.213721 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators(0007731d-b209-44eb-b95a-5b3b95a02ac2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators(0007731d-b209-44eb-b95a-5b3b95a02ac2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(e4df96221beb6da8cd3ec428ba00a6db14cd75945cc4b147c21a1e65cecf3f48): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" podUID="0007731d-b209-44eb-b95a-5b3b95a02ac2" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.226052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c5b3de4-5006-4a83-b672-1fa5f2bf2cec-openshift-service-ca\") pod \"perses-operator-5446b9c989-w2xlx\" (UID: \"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec\") " pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.226147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zq8s\" (UniqueName: \"kubernetes.io/projected/2c5b3de4-5006-4a83-b672-1fa5f2bf2cec-kube-api-access-9zq8s\") pod \"perses-operator-5446b9c989-w2xlx\" (UID: \"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec\") " pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.227104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c5b3de4-5006-4a83-b672-1fa5f2bf2cec-openshift-service-ca\") pod \"perses-operator-5446b9c989-w2xlx\" (UID: \"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec\") " pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.242588 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zq8s\" (UniqueName: \"kubernetes.io/projected/2c5b3de4-5006-4a83-b672-1fa5f2bf2cec-kube-api-access-9zq8s\") pod \"perses-operator-5446b9c989-w2xlx\" (UID: \"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec\") " pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.287876 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.303657 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d0e6791de66acb992379b287cbe53712a61f4430e0d62bef0f6e1df07925c54f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.303724 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d0e6791de66acb992379b287cbe53712a61f4430e0d62bef0f6e1df07925c54f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.303752 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d0e6791de66acb992379b287cbe53712a61f4430e0d62bef0f6e1df07925c54f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:28 crc kubenswrapper[4735]: E1209 15:08:28.303794 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-h5bn2_openshift-operators(63cae057-68b9-4d57-a64e-fd9314da6cfd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-h5bn2_openshift-operators(63cae057-68b9-4d57-a64e-fd9314da6cfd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d0e6791de66acb992379b287cbe53712a61f4430e0d62bef0f6e1df07925c54f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" podUID="63cae057-68b9-4d57-a64e-fd9314da6cfd" Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.341833 4735 generic.go:334] "Generic (PLEG): container finished" podID="9faa308c-34e4-40c7-996f-a8087be1b78c" containerID="5f85e5de2eb874a032e910c50464a52a1a3c6252b890881e94f8d9d7ac8d0e8b" exitCode=0 Dec 09 15:08:28 crc kubenswrapper[4735]: I1209 15:08:28.341879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerDied","Data":"5f85e5de2eb874a032e910c50464a52a1a3c6252b890881e94f8d9d7ac8d0e8b"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.348804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"004050159b22344cf6f8b7c9cc8fe74aaeb01c21a29177fe24a6bb94fcf77993"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.349034 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"b8e2020a690b008b46a72789aa4e0a7f30823db6d9e39dd96c0daf1cdf7702ab"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.349045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"6a9aec81fe2b3ee7f33ade429b9871c0e447c07d2d2359d72d633a0c14b31a10"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.349053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"7eed72a3af78ec16b3f78dac6a665497e0ea8692a9fc8a4838f4df2fe664caa0"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.349061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"1a5790bbf681fe0d18f86b9b9431f237ddbbff627d84e5b28567de479e9fe3fe"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.349068 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"128985a5db1b8cd9d4ed14f1e182980ed38f332ff4da43c27b1549199c18299d"} Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.403041 4735 kubelet_pods.go:1007] "Unable to retrieve pull secret, 
the image pull may not succeed." pod="openshift-operators/perses-operator-5446b9c989-w2xlx" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.403094 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:29 crc kubenswrapper[4735]: E1209 15:08:29.427335 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(8938741265ffee0a93fdc84223176aee23d96d60146701680a84679a29e791c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:29 crc kubenswrapper[4735]: E1209 15:08:29.427382 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(8938741265ffee0a93fdc84223176aee23d96d60146701680a84679a29e791c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:29 crc kubenswrapper[4735]: E1209 15:08:29.427400 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(8938741265ffee0a93fdc84223176aee23d96d60146701680a84679a29e791c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:29 crc kubenswrapper[4735]: E1209 15:08:29.427431 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-w2xlx_openshift-operators(2c5b3de4-5006-4a83-b672-1fa5f2bf2cec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-w2xlx_openshift-operators(2c5b3de4-5006-4a83-b672-1fa5f2bf2cec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(8938741265ffee0a93fdc84223176aee23d96d60146701680a84679a29e791c8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" podUID="2c5b3de4-5006-4a83-b672-1fa5f2bf2cec" Dec 09 15:08:29 crc kubenswrapper[4735]: I1209 15:08:29.547745 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-f4rnb" Dec 09 15:08:31 crc kubenswrapper[4735]: I1209 15:08:31.360720 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"d85cdd64192993c382eb01c489779b72aaa9e266a9f2ee65ac1c5edface8198e"} Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.373624 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" event={"ID":"9faa308c-34e4-40c7-996f-a8087be1b78c","Type":"ContainerStarted","Data":"de0277bd1e68d87d3e486f21e6fb0bc6247e322ef7bc5e5221334be6979bf371"} Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.374103 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.374116 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.374124 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.399487 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.404044 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.407862 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" podStartSLOduration=7.407847878 podStartE2EDuration="7.407847878s" podCreationTimestamp="2025-12-09 15:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:08:33.404761777 +0000 UTC m=+592.329600406" watchObservedRunningTime="2025-12-09 15:08:33.407847878 +0000 UTC m=+592.332686506" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.808061 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47"] Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.808154 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.808384 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.812300 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp"] Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.812358 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.812558 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.827290 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-h5bn2"] Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.827398 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.827838 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.832543 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-w2xlx"] Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.832645 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.833015 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.835642 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(1195f5f4638d7f9c76edb51177017101d1c8e986a70e62c19f4d40b2e39e4f95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.835687 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(1195f5f4638d7f9c76edb51177017101d1c8e986a70e62c19f4d40b2e39e4f95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.835714 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(1195f5f4638d7f9c76edb51177017101d1c8e986a70e62c19f4d40b2e39e4f95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.835750 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators(0007731d-b209-44eb-b95a-5b3b95a02ac2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators(0007731d-b209-44eb-b95a-5b3b95a02ac2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(1195f5f4638d7f9c76edb51177017101d1c8e986a70e62c19f4d40b2e39e4f95): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" podUID="0007731d-b209-44eb-b95a-5b3b95a02ac2" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.842833 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(98dcd0435b0724a833c11bf7400c1e671441dcc23bbe343d1385bb68eb597622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.842871 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(98dcd0435b0724a833c11bf7400c1e671441dcc23bbe343d1385bb68eb597622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.842889 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(98dcd0435b0724a833c11bf7400c1e671441dcc23bbe343d1385bb68eb597622): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.842917 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators(bfe62381-3825-4204-bbf4-8970225de2c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators(bfe62381-3825-4204-bbf4-8970225de2c4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(98dcd0435b0724a833c11bf7400c1e671441dcc23bbe343d1385bb68eb597622): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" podUID="bfe62381-3825-4204-bbf4-8970225de2c4" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.858807 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8"] Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.858915 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:33 crc kubenswrapper[4735]: I1209 15:08:33.859362 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.862691 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d8700fcb9dedda0cfa8803a5548effad721281261d28608892f33713a0b983be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.862762 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d8700fcb9dedda0cfa8803a5548effad721281261d28608892f33713a0b983be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.862784 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d8700fcb9dedda0cfa8803a5548effad721281261d28608892f33713a0b983be): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.862826 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-h5bn2_openshift-operators(63cae057-68b9-4d57-a64e-fd9314da6cfd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-h5bn2_openshift-operators(63cae057-68b9-4d57-a64e-fd9314da6cfd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(d8700fcb9dedda0cfa8803a5548effad721281261d28608892f33713a0b983be): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" podUID="63cae057-68b9-4d57-a64e-fd9314da6cfd" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.868343 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(9946ebaaeca3d5eb2101c69cb324564580e0727728db08ce006545fcb28622f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.868392 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(9946ebaaeca3d5eb2101c69cb324564580e0727728db08ce006545fcb28622f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.868422 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(9946ebaaeca3d5eb2101c69cb324564580e0727728db08ce006545fcb28622f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.868462 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-w2xlx_openshift-operators(2c5b3de4-5006-4a83-b672-1fa5f2bf2cec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-w2xlx_openshift-operators(2c5b3de4-5006-4a83-b672-1fa5f2bf2cec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(9946ebaaeca3d5eb2101c69cb324564580e0727728db08ce006545fcb28622f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" podUID="2c5b3de4-5006-4a83-b672-1fa5f2bf2cec" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.893833 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(f540494f81b2492318639c89cf270a3470d1336d97ffd919b1b7d589282c60b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.893882 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(f540494f81b2492318639c89cf270a3470d1336d97ffd919b1b7d589282c60b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.893901 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(f540494f81b2492318639c89cf270a3470d1336d97ffd919b1b7d589282c60b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:33 crc kubenswrapper[4735]: E1209 15:08:33.893938 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators(205460a8-cdae-43e1-8b9c-123f7f4f8c29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators(205460a8-cdae-43e1-8b9c-123f7f4f8c29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(f540494f81b2492318639c89cf270a3470d1336d97ffd919b1b7d589282c60b1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" podUID="205460a8-cdae-43e1-8b9c-123f7f4f8c29" Dec 09 15:08:38 crc kubenswrapper[4735]: I1209 15:08:38.414503 4735 scope.go:117] "RemoveContainer" containerID="88bc3bb0b0d1327a3335aadac40c46ac49a79d37f1e1436ccb892cbaa982f40d" Dec 09 15:08:38 crc kubenswrapper[4735]: E1209 15:08:38.416525 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xnf8f_openshift-multus(67d17a09-b547-49cf-8195-5af12413f51c)\"" pod="openshift-multus/multus-xnf8f" podUID="67d17a09-b547-49cf-8195-5af12413f51c" Dec 09 15:08:41 crc kubenswrapper[4735]: I1209 15:08:41.549564 4735 scope.go:117] "RemoveContainer" containerID="70c317c67e8e5ce91441feb6ef1b7ba908dc0fbdf0e17d6bbc441e6845955fed" Dec 09 15:08:42 crc kubenswrapper[4735]: I1209 15:08:42.417912 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/2.log" Dec 09 15:08:44 crc kubenswrapper[4735]: I1209 15:08:44.413297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:44 crc kubenswrapper[4735]: I1209 15:08:44.413999 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:44 crc kubenswrapper[4735]: E1209 15:08:44.439174 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(51fa59d82a32bfd48b8f8a874fe02f0e158cb85e747d210cb5c27858650a6a28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 09 15:08:44 crc kubenswrapper[4735]: E1209 15:08:44.439226 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(51fa59d82a32bfd48b8f8a874fe02f0e158cb85e747d210cb5c27858650a6a28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:44 crc kubenswrapper[4735]: E1209 15:08:44.439247 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(51fa59d82a32bfd48b8f8a874fe02f0e158cb85e747d210cb5c27858650a6a28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:44 crc kubenswrapper[4735]: E1209 15:08:44.439290 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-h5bn2_openshift-operators(63cae057-68b9-4d57-a64e-fd9314da6cfd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-h5bn2_openshift-operators(63cae057-68b9-4d57-a64e-fd9314da6cfd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-h5bn2_openshift-operators_63cae057-68b9-4d57-a64e-fd9314da6cfd_0(51fa59d82a32bfd48b8f8a874fe02f0e158cb85e747d210cb5c27858650a6a28): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" podUID="63cae057-68b9-4d57-a64e-fd9314da6cfd" Dec 09 15:08:46 crc kubenswrapper[4735]: I1209 15:08:46.413919 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:46 crc kubenswrapper[4735]: I1209 15:08:46.414369 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:46 crc kubenswrapper[4735]: E1209 15:08:46.436568 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(10f1d814e4eabba6a3544ed78fd7c2b23a2c9d7dfdf11ed673c2051e4e3f66fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:46 crc kubenswrapper[4735]: E1209 15:08:46.436880 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(10f1d814e4eabba6a3544ed78fd7c2b23a2c9d7dfdf11ed673c2051e4e3f66fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:46 crc kubenswrapper[4735]: E1209 15:08:46.436952 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(10f1d814e4eabba6a3544ed78fd7c2b23a2c9d7dfdf11ed673c2051e4e3f66fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:46 crc kubenswrapper[4735]: E1209 15:08:46.437050 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators(205460a8-cdae-43e1-8b9c-123f7f4f8c29)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators(205460a8-cdae-43e1-8b9c-123f7f4f8c29)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_openshift-operators_205460a8-cdae-43e1-8b9c-123f7f4f8c29_0(10f1d814e4eabba6a3544ed78fd7c2b23a2c9d7dfdf11ed673c2051e4e3f66fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" podUID="205460a8-cdae-43e1-8b9c-123f7f4f8c29" Dec 09 15:08:47 crc kubenswrapper[4735]: I1209 15:08:47.413447 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:47 crc kubenswrapper[4735]: I1209 15:08:47.413468 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:47 crc kubenswrapper[4735]: I1209 15:08:47.413958 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:47 crc kubenswrapper[4735]: I1209 15:08:47.414232 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.435741 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(e25c26eef87aeb9808c9b17890e61d9982fd46da351e05d83a2eb56a0a5fdb8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.435788 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(e25c26eef87aeb9808c9b17890e61d9982fd46da351e05d83a2eb56a0a5fdb8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.435813 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(e25c26eef87aeb9808c9b17890e61d9982fd46da351e05d83a2eb56a0a5fdb8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.435846 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators(bfe62381-3825-4204-bbf4-8970225de2c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators(bfe62381-3825-4204-bbf4-8970225de2c4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-nqm47_openshift-operators_bfe62381-3825-4204-bbf4-8970225de2c4_0(e25c26eef87aeb9808c9b17890e61d9982fd46da351e05d83a2eb56a0a5fdb8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" podUID="bfe62381-3825-4204-bbf4-8970225de2c4" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.445047 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(a7f9a9a5f6456de5d535ad2ed4168b60b3a73888571374f186b1d73cfcfa6fd0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.445087 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(a7f9a9a5f6456de5d535ad2ed4168b60b3a73888571374f186b1d73cfcfa6fd0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.445107 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(a7f9a9a5f6456de5d535ad2ed4168b60b3a73888571374f186b1d73cfcfa6fd0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:08:47 crc kubenswrapper[4735]: E1209 15:08:47.445142 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-w2xlx_openshift-operators(2c5b3de4-5006-4a83-b672-1fa5f2bf2cec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-w2xlx_openshift-operators(2c5b3de4-5006-4a83-b672-1fa5f2bf2cec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-w2xlx_openshift-operators_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec_0(a7f9a9a5f6456de5d535ad2ed4168b60b3a73888571374f186b1d73cfcfa6fd0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" podUID="2c5b3de4-5006-4a83-b672-1fa5f2bf2cec" Dec 09 15:08:48 crc kubenswrapper[4735]: I1209 15:08:48.413769 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:48 crc kubenswrapper[4735]: I1209 15:08:48.414621 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:48 crc kubenswrapper[4735]: E1209 15:08:48.436328 4735 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(71109afa08513ca5feacfca96a1cf34d96bbe182d2f976a61e512c0f8808154a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 09 15:08:48 crc kubenswrapper[4735]: E1209 15:08:48.436389 4735 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(71109afa08513ca5feacfca96a1cf34d96bbe182d2f976a61e512c0f8808154a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:48 crc kubenswrapper[4735]: E1209 15:08:48.436411 4735 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(71109afa08513ca5feacfca96a1cf34d96bbe182d2f976a61e512c0f8808154a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:48 crc kubenswrapper[4735]: E1209 15:08:48.436502 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators(0007731d-b209-44eb-b95a-5b3b95a02ac2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators(0007731d-b209-44eb-b95a-5b3b95a02ac2)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_openshift-operators_0007731d-b209-44eb-b95a-5b3b95a02ac2_0(71109afa08513ca5feacfca96a1cf34d96bbe182d2f976a61e512c0f8808154a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" podUID="0007731d-b209-44eb-b95a-5b3b95a02ac2" Dec 09 15:08:50 crc kubenswrapper[4735]: I1209 15:08:50.414007 4735 scope.go:117] "RemoveContainer" containerID="88bc3bb0b0d1327a3335aadac40c46ac49a79d37f1e1436ccb892cbaa982f40d" Dec 09 15:08:51 crc kubenswrapper[4735]: I1209 15:08:51.459849 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xnf8f_67d17a09-b547-49cf-8195-5af12413f51c/kube-multus/2.log" Dec 09 15:08:51 crc kubenswrapper[4735]: I1209 15:08:51.460633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xnf8f" event={"ID":"67d17a09-b547-49cf-8195-5af12413f51c","Type":"ContainerStarted","Data":"e1bdc9f2e9fec7df3c252965200fce9a497ccbfba06c80a84fd6628de92f0dcd"} Dec 09 15:08:57 crc kubenswrapper[4735]: I1209 15:08:57.254780 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf92r" Dec 09 15:08:57 crc kubenswrapper[4735]: I1209 15:08:57.413230 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:57 crc kubenswrapper[4735]: I1209 15:08:57.413769 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" Dec 09 15:08:57 crc kubenswrapper[4735]: I1209 15:08:57.562865 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8"] Dec 09 15:08:58 crc kubenswrapper[4735]: I1209 15:08:58.413271 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:58 crc kubenswrapper[4735]: I1209 15:08:58.413552 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:08:58 crc kubenswrapper[4735]: I1209 15:08:58.488452 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" event={"ID":"205460a8-cdae-43e1-8b9c-123f7f4f8c29","Type":"ContainerStarted","Data":"dac4a205d2b83d9b547dedfd87cb63a84aa9f0eb08f5f043a2008abc5329456e"} Dec 09 15:08:58 crc kubenswrapper[4735]: I1209 15:08:58.555444 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-h5bn2"] Dec 09 15:08:59 crc kubenswrapper[4735]: I1209 15:08:59.413716 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:59 crc kubenswrapper[4735]: I1209 15:08:59.414117 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" Dec 09 15:08:59 crc kubenswrapper[4735]: I1209 15:08:59.494336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" event={"ID":"63cae057-68b9-4d57-a64e-fd9314da6cfd","Type":"ContainerStarted","Data":"7838df84dd5948f3bac89263a71cb9a3b110ba790b25eeae87c6133f576f4373"} Dec 09 15:08:59 crc kubenswrapper[4735]: I1209 15:08:59.759264 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp"] Dec 09 15:08:59 crc kubenswrapper[4735]: W1209 15:08:59.766209 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0007731d_b209_44eb_b95a_5b3b95a02ac2.slice/crio-b0a04d81449dfdb84543740a315f64d5f4efd08b5d978aba26f2870e7c0bda1e WatchSource:0}: Error finding container b0a04d81449dfdb84543740a315f64d5f4efd08b5d978aba26f2870e7c0bda1e: Status 404 returned error can't find the container with id b0a04d81449dfdb84543740a315f64d5f4efd08b5d978aba26f2870e7c0bda1e Dec 09 15:09:00 crc kubenswrapper[4735]: I1209 15:09:00.413171 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:09:00 crc kubenswrapper[4735]: I1209 15:09:00.413456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" Dec 09 15:09:00 crc kubenswrapper[4735]: I1209 15:09:00.513968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" event={"ID":"0007731d-b209-44eb-b95a-5b3b95a02ac2","Type":"ContainerStarted","Data":"b0a04d81449dfdb84543740a315f64d5f4efd08b5d978aba26f2870e7c0bda1e"} Dec 09 15:09:02 crc kubenswrapper[4735]: I1209 15:09:02.413112 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:09:02 crc kubenswrapper[4735]: I1209 15:09:02.413721 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:09:02 crc kubenswrapper[4735]: I1209 15:09:02.629975 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47"] Dec 09 15:09:02 crc kubenswrapper[4735]: W1209 15:09:02.634797 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe62381_3825_4204_bbf4_8970225de2c4.slice/crio-52aa38b3449d2b12942de5c488fb3da53565cec0f9e2fe9739be8f5c84a78b46 WatchSource:0}: Error finding container 52aa38b3449d2b12942de5c488fb3da53565cec0f9e2fe9739be8f5c84a78b46: Status 404 returned error can't find the container with id 52aa38b3449d2b12942de5c488fb3da53565cec0f9e2fe9739be8f5c84a78b46 Dec 09 15:09:02 crc kubenswrapper[4735]: I1209 15:09:02.665703 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-w2xlx"] Dec 09 15:09:02 crc kubenswrapper[4735]: W1209 15:09:02.668876 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c5b3de4_5006_4a83_b672_1fa5f2bf2cec.slice/crio-91fee524da4683c684d5aaca3d4d077ab3ac16849dae73d6cdd3437f936a1201 WatchSource:0}: Error finding container 91fee524da4683c684d5aaca3d4d077ab3ac16849dae73d6cdd3437f936a1201: Status 404 returned error can't find the container with id 91fee524da4683c684d5aaca3d4d077ab3ac16849dae73d6cdd3437f936a1201 Dec 09 15:09:03 crc kubenswrapper[4735]: I1209 15:09:03.534760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" event={"ID":"0007731d-b209-44eb-b95a-5b3b95a02ac2","Type":"ContainerStarted","Data":"5cde9a4c3839e16e7e40993d7ce92a79ce720b177992e73223c32fec71cabc35"} Dec 09 15:09:03 crc kubenswrapper[4735]: I1209 15:09:03.535639 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" event={"ID":"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec","Type":"ContainerStarted","Data":"91fee524da4683c684d5aaca3d4d077ab3ac16849dae73d6cdd3437f936a1201"} Dec 09 15:09:03 crc kubenswrapper[4735]: I1209 15:09:03.536669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" event={"ID":"bfe62381-3825-4204-bbf4-8970225de2c4","Type":"ContainerStarted","Data":"52aa38b3449d2b12942de5c488fb3da53565cec0f9e2fe9739be8f5c84a78b46"} Dec 09 15:09:03 crc kubenswrapper[4735]: I1209 15:09:03.538262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" event={"ID":"205460a8-cdae-43e1-8b9c-123f7f4f8c29","Type":"ContainerStarted","Data":"10a92d1d7dac9f0e902578ea5527db5d27b75815e9ec7a6f2cfd63c0c52b848b"} Dec 09 15:09:03 crc kubenswrapper[4735]: I1209 15:09:03.572271 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8" podStartSLOduration=31.680711556 podStartE2EDuration="36.572253449s" podCreationTimestamp="2025-12-09 15:08:27 +0000 UTC" firstStartedPulling="2025-12-09 15:08:57.574285868 +0000 UTC m=+616.499124496" lastFinishedPulling="2025-12-09 15:09:02.465827761 +0000 UTC m=+621.390666389" observedRunningTime="2025-12-09 15:09:03.571721169 +0000 UTC m=+622.496559797" watchObservedRunningTime="2025-12-09 15:09:03.572253449 +0000 UTC 
m=+622.497092078" Dec 09 15:09:03 crc kubenswrapper[4735]: I1209 15:09:03.575051 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp" podStartSLOduration=33.893446394 podStartE2EDuration="36.575044155s" podCreationTimestamp="2025-12-09 15:08:27 +0000 UTC" firstStartedPulling="2025-12-09 15:08:59.767897514 +0000 UTC m=+618.692736143" lastFinishedPulling="2025-12-09 15:09:02.449495275 +0000 UTC m=+621.374333904" observedRunningTime="2025-12-09 15:09:03.555155495 +0000 UTC m=+622.479994123" watchObservedRunningTime="2025-12-09 15:09:03.575044155 +0000 UTC m=+622.499882782" Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.559713 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" event={"ID":"2c5b3de4-5006-4a83-b672-1fa5f2bf2cec","Type":"ContainerStarted","Data":"8bb6ffef270ee1a1baf9a10902ef750904d9a240435f3098f985fa1a2fa40fa4"} Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.560099 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.561530 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" event={"ID":"bfe62381-3825-4204-bbf4-8970225de2c4","Type":"ContainerStarted","Data":"387ce8ce878c17c808dcac84a46ec2a2e3364f1466e55a63f0e181f0a023564a"} Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.563928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" event={"ID":"63cae057-68b9-4d57-a64e-fd9314da6cfd","Type":"ContainerStarted","Data":"cf26d9a7ba67256e1a9885fa81b0f5f36f664d3356107ae273f4b6e631537604"} Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.564600 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.566729 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.577306 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" podStartSLOduration=35.77733491 podStartE2EDuration="39.577287248s" podCreationTimestamp="2025-12-09 15:08:28 +0000 UTC" firstStartedPulling="2025-12-09 15:09:02.671337025 +0000 UTC m=+621.596175644" lastFinishedPulling="2025-12-09 15:09:06.471289354 +0000 UTC m=+625.396127982" observedRunningTime="2025-12-09 15:09:07.574213712 +0000 UTC m=+626.499052340" watchObservedRunningTime="2025-12-09 15:09:07.577287248 +0000 UTC m=+626.502125877" Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.588419 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-nqm47" podStartSLOduration=36.757413917 podStartE2EDuration="40.588407047s" podCreationTimestamp="2025-12-09 15:08:27 +0000 UTC" firstStartedPulling="2025-12-09 15:09:02.636782061 +0000 UTC m=+621.561620688" lastFinishedPulling="2025-12-09 15:09:06.467775189 +0000 UTC m=+625.392613818" observedRunningTime="2025-12-09 15:09:07.585734063 +0000 UTC m=+626.510572691" watchObservedRunningTime="2025-12-09 15:09:07.588407047 +0000 UTC 
m=+626.513245675" Dec 09 15:09:07 crc kubenswrapper[4735]: I1209 15:09:07.607852 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-h5bn2" podStartSLOduration=32.692577535 podStartE2EDuration="40.607832898s" podCreationTimestamp="2025-12-09 15:08:27 +0000 UTC" firstStartedPulling="2025-12-09 15:08:58.579956772 +0000 UTC m=+617.504795400" lastFinishedPulling="2025-12-09 15:09:06.495212136 +0000 UTC m=+625.420050763" observedRunningTime="2025-12-09 15:09:07.606151808 +0000 UTC m=+626.530990436" watchObservedRunningTime="2025-12-09 15:09:07.607832898 +0000 UTC m=+626.532671526" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.136447 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mnkbr"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.137359 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.138991 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.139438 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j4846" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.143244 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-j9rnk"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.143281 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.143894 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-j9rnk" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.145259 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xtfvg" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.147262 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mnkbr"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.150260 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sxb9d"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.150796 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.151814 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-z59gg" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.158662 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-j9rnk"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.163927 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sxb9d"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.178289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97q9t\" (UniqueName: \"kubernetes.io/projected/fb15867f-8803-4aa2-b592-5d6267f53c4f-kube-api-access-97q9t\") pod \"cert-manager-webhook-5655c58dd6-sxb9d\" (UID: \"fb15867f-8803-4aa2-b592-5d6267f53c4f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.178379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfhj\" (UniqueName: \"kubernetes.io/projected/835eedbd-5d0a-4837-997e-53d608904958-kube-api-access-4vfhj\") pod \"cert-manager-5b446d88c5-j9rnk\" (UID: \"835eedbd-5d0a-4837-997e-53d608904958\") " pod="cert-manager/cert-manager-5b446d88c5-j9rnk" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.178530 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chq55\" (UniqueName: \"kubernetes.io/projected/633cc636-711f-4c6f-9b1e-a8ed2b60f487-kube-api-access-chq55\") pod \"cert-manager-cainjector-7f985d654d-mnkbr\" (UID: \"633cc636-711f-4c6f-9b1e-a8ed2b60f487\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.279293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chq55\" (UniqueName: \"kubernetes.io/projected/633cc636-711f-4c6f-9b1e-a8ed2b60f487-kube-api-access-chq55\") pod \"cert-manager-cainjector-7f985d654d-mnkbr\" (UID: \"633cc636-711f-4c6f-9b1e-a8ed2b60f487\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.279394 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97q9t\" (UniqueName: \"kubernetes.io/projected/fb15867f-8803-4aa2-b592-5d6267f53c4f-kube-api-access-97q9t\") pod \"cert-manager-webhook-5655c58dd6-sxb9d\" (UID: \"fb15867f-8803-4aa2-b592-5d6267f53c4f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.279448 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfhj\" (UniqueName: \"kubernetes.io/projected/835eedbd-5d0a-4837-997e-53d608904958-kube-api-access-4vfhj\") pod \"cert-manager-5b446d88c5-j9rnk\" (UID: \"835eedbd-5d0a-4837-997e-53d608904958\") " pod="cert-manager/cert-manager-5b446d88c5-j9rnk" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.297353 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfhj\" (UniqueName: \"kubernetes.io/projected/835eedbd-5d0a-4837-997e-53d608904958-kube-api-access-4vfhj\") pod \"cert-manager-5b446d88c5-j9rnk\" (UID: \"835eedbd-5d0a-4837-997e-53d608904958\") " 
pod="cert-manager/cert-manager-5b446d88c5-j9rnk" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.297354 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97q9t\" (UniqueName: \"kubernetes.io/projected/fb15867f-8803-4aa2-b592-5d6267f53c4f-kube-api-access-97q9t\") pod \"cert-manager-webhook-5655c58dd6-sxb9d\" (UID: \"fb15867f-8803-4aa2-b592-5d6267f53c4f\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.298321 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chq55\" (UniqueName: \"kubernetes.io/projected/633cc636-711f-4c6f-9b1e-a8ed2b60f487-kube-api-access-chq55\") pod \"cert-manager-cainjector-7f985d654d-mnkbr\" (UID: \"633cc636-711f-4c6f-9b1e-a8ed2b60f487\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.451563 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.460786 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-j9rnk" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.466181 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.839113 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sxb9d"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.869930 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mnkbr"] Dec 09 15:09:12 crc kubenswrapper[4735]: I1209 15:09:12.878560 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-j9rnk"] Dec 09 15:09:12 crc kubenswrapper[4735]: W1209 15:09:12.880372 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod835eedbd_5d0a_4837_997e_53d608904958.slice/crio-e62895e91b7bcd8f7825c6125aa7b3fc816d8176f0a131c0ca17a9826decfc5c WatchSource:0}: Error finding container e62895e91b7bcd8f7825c6125aa7b3fc816d8176f0a131c0ca17a9826decfc5c: Status 404 returned error can't find the container with id e62895e91b7bcd8f7825c6125aa7b3fc816d8176f0a131c0ca17a9826decfc5c Dec 09 15:09:13 crc kubenswrapper[4735]: I1209 15:09:13.597715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" event={"ID":"633cc636-711f-4c6f-9b1e-a8ed2b60f487","Type":"ContainerStarted","Data":"f1a61533e80c9994d4575c77ee71bc6832607f375b988b6eda4d34ee35348bc3"} Dec 09 15:09:13 crc kubenswrapper[4735]: I1209 15:09:13.600196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-j9rnk" event={"ID":"835eedbd-5d0a-4837-997e-53d608904958","Type":"ContainerStarted","Data":"e62895e91b7bcd8f7825c6125aa7b3fc816d8176f0a131c0ca17a9826decfc5c"} Dec 09 15:09:13 crc kubenswrapper[4735]: I1209 15:09:13.602112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" event={"ID":"fb15867f-8803-4aa2-b592-5d6267f53c4f","Type":"ContainerStarted","Data":"e22fd7b1abf0331e90060c5adfda5778e711684a25dafcc67ce74f4b80211094"} Dec 09 15:09:15 crc kubenswrapper[4735]: 
I1209 15:09:15.616969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-j9rnk" event={"ID":"835eedbd-5d0a-4837-997e-53d608904958","Type":"ContainerStarted","Data":"ecaa10e3a6e9dce471d41bbc1862292135b50a487d657b7db5d88489bd2e3fca"} Dec 09 15:09:15 crc kubenswrapper[4735]: I1209 15:09:15.618260 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" event={"ID":"633cc636-711f-4c6f-9b1e-a8ed2b60f487","Type":"ContainerStarted","Data":"171ed931af583808320a3ef8221f2add7228da3f7d67e80fb314137deaa6d9bc"} Dec 09 15:09:15 crc kubenswrapper[4735]: I1209 15:09:15.630474 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-j9rnk" podStartSLOduration=1.347079288 podStartE2EDuration="3.630460191s" podCreationTimestamp="2025-12-09 15:09:12 +0000 UTC" firstStartedPulling="2025-12-09 15:09:12.882826659 +0000 UTC m=+631.807665287" lastFinishedPulling="2025-12-09 15:09:15.166207562 +0000 UTC m=+634.091046190" observedRunningTime="2025-12-09 15:09:15.628440364 +0000 UTC m=+634.553278992" watchObservedRunningTime="2025-12-09 15:09:15.630460191 +0000 UTC m=+634.555298818" Dec 09 15:09:15 crc kubenswrapper[4735]: I1209 15:09:15.639892 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-mnkbr" podStartSLOduration=1.319261906 podStartE2EDuration="3.63988323s" podCreationTimestamp="2025-12-09 15:09:12 +0000 UTC" firstStartedPulling="2025-12-09 15:09:12.875373591 +0000 UTC m=+631.800212219" lastFinishedPulling="2025-12-09 15:09:15.195994916 +0000 UTC m=+634.120833543" observedRunningTime="2025-12-09 15:09:15.638553782 +0000 UTC m=+634.563392410" watchObservedRunningTime="2025-12-09 15:09:15.63988323 +0000 UTC m=+634.564721859" Dec 09 15:09:16 crc kubenswrapper[4735]: I1209 15:09:16.623262 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" event={"ID":"fb15867f-8803-4aa2-b592-5d6267f53c4f","Type":"ContainerStarted","Data":"886a350d483bad7986ac03c08ecb150f0b9088756a20ba88083a072849bf935d"} Dec 09 15:09:16 crc kubenswrapper[4735]: I1209 15:09:16.636592 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" podStartSLOduration=1.654272049 podStartE2EDuration="4.636568668s" podCreationTimestamp="2025-12-09 15:09:12 +0000 UTC" firstStartedPulling="2025-12-09 15:09:12.852469885 +0000 UTC m=+631.777308513" lastFinishedPulling="2025-12-09 15:09:15.834766503 +0000 UTC m=+634.759605132" observedRunningTime="2025-12-09 15:09:16.636017463 +0000 UTC m=+635.560856101" watchObservedRunningTime="2025-12-09 15:09:16.636568668 +0000 UTC m=+635.561407296" Dec 09 15:09:17 crc kubenswrapper[4735]: I1209 15:09:17.467046 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:19 crc kubenswrapper[4735]: I1209 15:09:19.406966 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-w2xlx" Dec 09 15:09:22 crc kubenswrapper[4735]: I1209 15:09:22.469088 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-sxb9d" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.167329 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59"] Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.168990 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.170827 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.180366 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59"] Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.257973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.258016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dnr\" (UniqueName: \"kubernetes.io/projected/714ef1b2-0a49-4258-aee7-e551f87c0ef4-kube-api-access-q7dnr\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.258053 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.308333 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp"] Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.310059 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.314845 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp"] Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.358983 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p846q\" (UniqueName: \"kubernetes.io/projected/abd436d6-c0b4-4779-a531-98449b2755da-kube-api-access-p846q\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359229 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dnr\" (UniqueName: \"kubernetes.io/projected/714ef1b2-0a49-4258-aee7-e551f87c0ef4-kube-api-access-q7dnr\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359294 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359369 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " 
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.359722 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.374584 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dnr\" (UniqueName: \"kubernetes.io/projected/714ef1b2-0a49-4258-aee7-e551f87c0ef4-kube-api-access-q7dnr\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.460680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.460722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.460753 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p846q\" (UniqueName: \"kubernetes.io/projected/abd436d6-c0b4-4779-a531-98449b2755da-kube-api-access-p846q\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.461082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.461105 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.477248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p846q\" (UniqueName: 
\"kubernetes.io/projected/abd436d6-c0b4-4779-a531-98449b2755da-kube-api-access-p846q\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.481967 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.622297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.762626 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp"] Dec 09 15:09:43 crc kubenswrapper[4735]: W1209 15:09:43.766593 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabd436d6_c0b4_4779_a531_98449b2755da.slice/crio-666eb912530e6c70f7adcf5c36229ec909d46395f07c42a6f0b49f2b7b6d2d1a WatchSource:0}: Error finding container 666eb912530e6c70f7adcf5c36229ec909d46395f07c42a6f0b49f2b7b6d2d1a: Status 404 returned error can't find the container with id 666eb912530e6c70f7adcf5c36229ec909d46395f07c42a6f0b49f2b7b6d2d1a Dec 09 15:09:43 crc kubenswrapper[4735]: I1209 15:09:43.840772 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59"] Dec 09 15:09:43 crc kubenswrapper[4735]: W1209 15:09:43.844657 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714ef1b2_0a49_4258_aee7_e551f87c0ef4.slice/crio-0e50d9e7a8e2bcceb8562d89a1f360f7fd1f8b04f747c17c09d1031dae3e6608 WatchSource:0}: Error finding container 0e50d9e7a8e2bcceb8562d89a1f360f7fd1f8b04f747c17c09d1031dae3e6608: Status 404 returned error can't find the container with id 0e50d9e7a8e2bcceb8562d89a1f360f7fd1f8b04f747c17c09d1031dae3e6608 Dec 09 15:09:44 crc kubenswrapper[4735]: I1209 15:09:44.753764 4735 generic.go:334] "Generic (PLEG): container finished" podID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerID="0cd91f516f477ebd8397ce809483024a9e68ebaffc6e3b9acf3402cf6d7f6936" exitCode=0 Dec 09 15:09:44 crc kubenswrapper[4735]: I1209 15:09:44.753815 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" event={"ID":"714ef1b2-0a49-4258-aee7-e551f87c0ef4","Type":"ContainerDied","Data":"0cd91f516f477ebd8397ce809483024a9e68ebaffc6e3b9acf3402cf6d7f6936"} Dec 09 15:09:44 crc kubenswrapper[4735]: I1209 15:09:44.754039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" event={"ID":"714ef1b2-0a49-4258-aee7-e551f87c0ef4","Type":"ContainerStarted","Data":"0e50d9e7a8e2bcceb8562d89a1f360f7fd1f8b04f747c17c09d1031dae3e6608"} Dec 09 15:09:44 crc kubenswrapper[4735]: I1209 15:09:44.755364 4735 generic.go:334] "Generic (PLEG): container finished" podID="abd436d6-c0b4-4779-a531-98449b2755da" containerID="6d0f4c81677177d3848178bb62999b3f1e9563e56cbc407e45f82623e03a84e1" exitCode=0 Dec 09 15:09:44 crc kubenswrapper[4735]: I1209 
15:09:44.755392 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" event={"ID":"abd436d6-c0b4-4779-a531-98449b2755da","Type":"ContainerDied","Data":"6d0f4c81677177d3848178bb62999b3f1e9563e56cbc407e45f82623e03a84e1"} Dec 09 15:09:44 crc kubenswrapper[4735]: I1209 15:09:44.755411 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" event={"ID":"abd436d6-c0b4-4779-a531-98449b2755da","Type":"ContainerStarted","Data":"666eb912530e6c70f7adcf5c36229ec909d46395f07c42a6f0b49f2b7b6d2d1a"} Dec 09 15:09:46 crc kubenswrapper[4735]: I1209 15:09:46.765542 4735 generic.go:334] "Generic (PLEG): container finished" podID="abd436d6-c0b4-4779-a531-98449b2755da" containerID="586f38a2b6cb769e9a7e0ed1b0986ac511043753b2db53dc88e5788b052982d4" exitCode=0 Dec 09 15:09:46 crc kubenswrapper[4735]: I1209 15:09:46.765608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" event={"ID":"abd436d6-c0b4-4779-a531-98449b2755da","Type":"ContainerDied","Data":"586f38a2b6cb769e9a7e0ed1b0986ac511043753b2db53dc88e5788b052982d4"} Dec 09 15:09:46 crc kubenswrapper[4735]: I1209 15:09:46.767080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" event={"ID":"714ef1b2-0a49-4258-aee7-e551f87c0ef4","Type":"ContainerStarted","Data":"15b45d98e703f30e234d2798c23859e0f486724ee05b0e102a942d2264504e3a"} Dec 09 15:09:47 crc kubenswrapper[4735]: I1209 15:09:47.773876 4735 generic.go:334] "Generic (PLEG): container finished" podID="abd436d6-c0b4-4779-a531-98449b2755da" containerID="129e193bb33930b821e77e2ca7c5ebe6d90bddc557b2efbc86a8b62f88cbab1d" exitCode=0 Dec 09 15:09:47 crc kubenswrapper[4735]: I1209 15:09:47.773943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" event={"ID":"abd436d6-c0b4-4779-a531-98449b2755da","Type":"ContainerDied","Data":"129e193bb33930b821e77e2ca7c5ebe6d90bddc557b2efbc86a8b62f88cbab1d"} Dec 09 15:09:47 crc kubenswrapper[4735]: I1209 15:09:47.775540 4735 generic.go:334] "Generic (PLEG): container finished" podID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerID="15b45d98e703f30e234d2798c23859e0f486724ee05b0e102a942d2264504e3a" exitCode=0 Dec 09 15:09:47 crc kubenswrapper[4735]: I1209 15:09:47.775559 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" event={"ID":"714ef1b2-0a49-4258-aee7-e551f87c0ef4","Type":"ContainerDied","Data":"15b45d98e703f30e234d2798c23859e0f486724ee05b0e102a942d2264504e3a"} Dec 09 15:09:48 crc kubenswrapper[4735]: I1209 15:09:48.783243 4735 generic.go:334] "Generic (PLEG): container finished" podID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerID="71763b87fe3b7ef7b76a032952fb8863fef1481e718262d8cf7a126b85c38092" exitCode=0 Dec 09 15:09:48 crc kubenswrapper[4735]: I1209 15:09:48.783337 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" event={"ID":"714ef1b2-0a49-4258-aee7-e551f87c0ef4","Type":"ContainerDied","Data":"71763b87fe3b7ef7b76a032952fb8863fef1481e718262d8cf7a126b85c38092"} Dec 09 15:09:48 crc kubenswrapper[4735]: I1209 
15:09:48.968229 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.126181 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-bundle\") pod \"abd436d6-c0b4-4779-a531-98449b2755da\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.126272 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p846q\" (UniqueName: \"kubernetes.io/projected/abd436d6-c0b4-4779-a531-98449b2755da-kube-api-access-p846q\") pod \"abd436d6-c0b4-4779-a531-98449b2755da\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.126401 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-util\") pod \"abd436d6-c0b4-4779-a531-98449b2755da\" (UID: \"abd436d6-c0b4-4779-a531-98449b2755da\") " Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.127918 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-bundle" (OuterVolumeSpecName: "bundle") pod "abd436d6-c0b4-4779-a531-98449b2755da" (UID: "abd436d6-c0b4-4779-a531-98449b2755da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.132871 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abd436d6-c0b4-4779-a531-98449b2755da-kube-api-access-p846q" (OuterVolumeSpecName: "kube-api-access-p846q") pod "abd436d6-c0b4-4779-a531-98449b2755da" (UID: "abd436d6-c0b4-4779-a531-98449b2755da"). InnerVolumeSpecName "kube-api-access-p846q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.137245 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-util" (OuterVolumeSpecName: "util") pod "abd436d6-c0b4-4779-a531-98449b2755da" (UID: "abd436d6-c0b4-4779-a531-98449b2755da"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.228632 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.228679 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/abd436d6-c0b4-4779-a531-98449b2755da-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.228692 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p846q\" (UniqueName: \"kubernetes.io/projected/abd436d6-c0b4-4779-a531-98449b2755da-kube-api-access-p846q\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.790066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" event={"ID":"abd436d6-c0b4-4779-a531-98449b2755da","Type":"ContainerDied","Data":"666eb912530e6c70f7adcf5c36229ec909d46395f07c42a6f0b49f2b7b6d2d1a"} Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.790091 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.790105 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="666eb912530e6c70f7adcf5c36229ec909d46395f07c42a6f0b49f2b7b6d2d1a" Dec 09 15:09:49 crc kubenswrapper[4735]: I1209 15:09:49.962214 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.140008 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-bundle\") pod \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.140095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dnr\" (UniqueName: \"kubernetes.io/projected/714ef1b2-0a49-4258-aee7-e551f87c0ef4-kube-api-access-q7dnr\") pod \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.140148 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-util\") pod \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\" (UID: \"714ef1b2-0a49-4258-aee7-e551f87c0ef4\") " Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.140931 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-bundle" (OuterVolumeSpecName: "bundle") pod "714ef1b2-0a49-4258-aee7-e551f87c0ef4" (UID: "714ef1b2-0a49-4258-aee7-e551f87c0ef4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.145659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714ef1b2-0a49-4258-aee7-e551f87c0ef4-kube-api-access-q7dnr" (OuterVolumeSpecName: "kube-api-access-q7dnr") pod "714ef1b2-0a49-4258-aee7-e551f87c0ef4" (UID: "714ef1b2-0a49-4258-aee7-e551f87c0ef4"). InnerVolumeSpecName "kube-api-access-q7dnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.147582 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-util" (OuterVolumeSpecName: "util") pod "714ef1b2-0a49-4258-aee7-e551f87c0ef4" (UID: "714ef1b2-0a49-4258-aee7-e551f87c0ef4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.241987 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.242096 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7dnr\" (UniqueName: \"kubernetes.io/projected/714ef1b2-0a49-4258-aee7-e551f87c0ef4-kube-api-access-q7dnr\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.242154 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/714ef1b2-0a49-4258-aee7-e551f87c0ef4-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.797422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" event={"ID":"714ef1b2-0a49-4258-aee7-e551f87c0ef4","Type":"ContainerDied","Data":"0e50d9e7a8e2bcceb8562d89a1f360f7fd1f8b04f747c17c09d1031dae3e6608"} Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.797462 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e50d9e7a8e2bcceb8562d89a1f360f7fd1f8b04f747c17c09d1031dae3e6608" Dec 09 15:09:50 crc kubenswrapper[4735]: I1209 15:09:50.797490 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.803904 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh"] Dec 09 15:09:59 crc kubenswrapper[4735]: E1209 15:09:59.804306 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="pull" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804319 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="pull" Dec 09 15:09:59 crc kubenswrapper[4735]: E1209 15:09:59.804336 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="pull" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804341 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="pull" Dec 09 15:09:59 crc kubenswrapper[4735]: E1209 15:09:59.804353 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="util" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804358 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="util" Dec 09 15:09:59 crc kubenswrapper[4735]: E1209 15:09:59.804367 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="util" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804373 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="util" Dec 09 15:09:59 crc kubenswrapper[4735]: E1209 15:09:59.804382 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="extract" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804388 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="extract" Dec 09 15:09:59 crc kubenswrapper[4735]: E1209 15:09:59.804396 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="extract" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804401 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="extract" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804491 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="714ef1b2-0a49-4258-aee7-e551f87c0ef4" containerName="extract" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.804505 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="abd436d6-c0b4-4779-a531-98449b2755da" containerName="extract" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.805081 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.808082 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.808272 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.808903 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.809069 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-ldxfx" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.809187 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.809344 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.832879 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh"] Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.958856 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-webhook-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.958911 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.958943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/393ef666-9d18-4435-bbc9-76acb5636ce7-manager-config\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.959384 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-apiservice-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:09:59 crc kubenswrapper[4735]: I1209 15:09:59.959531 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9cq\" (UniqueName: 
\"kubernetes.io/projected/393ef666-9d18-4435-bbc9-76acb5636ce7-kube-api-access-6x9cq\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.060599 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-apiservice-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.060677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9cq\" (UniqueName: \"kubernetes.io/projected/393ef666-9d18-4435-bbc9-76acb5636ce7-kube-api-access-6x9cq\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.060713 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-webhook-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.060735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.060757 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/393ef666-9d18-4435-bbc9-76acb5636ce7-manager-config\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.061527 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/393ef666-9d18-4435-bbc9-76acb5636ce7-manager-config\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.065062 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.066284 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-apiservice-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.066830 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/393ef666-9d18-4435-bbc9-76acb5636ce7-webhook-cert\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.073842 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9cq\" (UniqueName: \"kubernetes.io/projected/393ef666-9d18-4435-bbc9-76acb5636ce7-kube-api-access-6x9cq\") pod \"loki-operator-controller-manager-ff768b6c6-qngzh\" (UID: \"393ef666-9d18-4435-bbc9-76acb5636ce7\") " pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.118110 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.475900 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh"] Dec 09 15:10:00 crc kubenswrapper[4735]: W1209 15:10:00.481596 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod393ef666_9d18_4435_bbc9_76acb5636ce7.slice/crio-fd8100375ea63534b849ab6f1ac52cf6d65d528b4023809cfd85659a1cc896a0 WatchSource:0}: Error finding container fd8100375ea63534b849ab6f1ac52cf6d65d528b4023809cfd85659a1cc896a0: Status 404 returned error can't find the container with id fd8100375ea63534b849ab6f1ac52cf6d65d528b4023809cfd85659a1cc896a0 Dec 09 15:10:00 crc kubenswrapper[4735]: I1209 15:10:00.849472 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" event={"ID":"393ef666-9d18-4435-bbc9-76acb5636ce7","Type":"ContainerStarted","Data":"fd8100375ea63534b849ab6f1ac52cf6d65d528b4023809cfd85659a1cc896a0"} Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.823268 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-gvl4x"] Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.824092 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.825768 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-5ktdh" Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.825977 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.826146 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.826481 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-gvl4x"] Dec 09 15:10:03 crc kubenswrapper[4735]: I1209 15:10:03.907081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfmp\" (UniqueName: \"kubernetes.io/projected/1fe25fc9-0b02-46c3-8654-1db38cfefabc-kube-api-access-wxfmp\") pod \"cluster-logging-operator-ff9846bd-gvl4x\" (UID: \"1fe25fc9-0b02-46c3-8654-1db38cfefabc\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.007895 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxfmp\" (UniqueName: \"kubernetes.io/projected/1fe25fc9-0b02-46c3-8654-1db38cfefabc-kube-api-access-wxfmp\") pod \"cluster-logging-operator-ff9846bd-gvl4x\" (UID: \"1fe25fc9-0b02-46c3-8654-1db38cfefabc\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.024636 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxfmp\" (UniqueName: \"kubernetes.io/projected/1fe25fc9-0b02-46c3-8654-1db38cfefabc-kube-api-access-wxfmp\") pod \"cluster-logging-operator-ff9846bd-gvl4x\" (UID: \"1fe25fc9-0b02-46c3-8654-1db38cfefabc\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.144265 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.335875 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.336110 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.361055 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-gvl4x"] Dec 09 15:10:04 crc kubenswrapper[4735]: I1209 15:10:04.871860 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" event={"ID":"1fe25fc9-0b02-46c3-8654-1db38cfefabc","Type":"ContainerStarted","Data":"21dd093fed3fcfae894b29106c7c44d8b75e2b551eaf4a070e34a6f8a609b434"} Dec 09 15:10:05 crc kubenswrapper[4735]: I1209 15:10:05.877586 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" event={"ID":"393ef666-9d18-4435-bbc9-76acb5636ce7","Type":"ContainerStarted","Data":"6e139f5dd0bd90260a2f00ffa445de98204764d5487f06bf12d2d00fe7adca78"} Dec 09 15:10:11 crc kubenswrapper[4735]: I1209 15:10:11.909428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" event={"ID":"393ef666-9d18-4435-bbc9-76acb5636ce7","Type":"ContainerStarted","Data":"8ca90400c92953e0526deb746c617e6b37415532a8117d5c59cd391b8718b5a0"} Dec 09 15:10:11 crc kubenswrapper[4735]: I1209 15:10:11.909845 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:11 crc kubenswrapper[4735]: I1209 15:10:11.911020 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" Dec 09 15:10:11 crc kubenswrapper[4735]: I1209 15:10:11.924497 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-ff768b6c6-qngzh" podStartSLOduration=2.129452543 podStartE2EDuration="12.92448239s" podCreationTimestamp="2025-12-09 15:09:59 +0000 UTC" firstStartedPulling="2025-12-09 15:10:00.483559075 +0000 UTC m=+679.408397704" lastFinishedPulling="2025-12-09 15:10:11.278588923 +0000 UTC m=+690.203427551" observedRunningTime="2025-12-09 15:10:11.924101224 +0000 UTC m=+690.848939852" watchObservedRunningTime="2025-12-09 15:10:11.92448239 +0000 UTC m=+690.849321019" Dec 09 15:10:18 crc kubenswrapper[4735]: I1209 15:10:18.955232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" event={"ID":"1fe25fc9-0b02-46c3-8654-1db38cfefabc","Type":"ContainerStarted","Data":"bddb523781825b407062dcd620bb5c4d8d4907554a88ee5b7f90f75e04f13940"} Dec 09 15:10:18 crc kubenswrapper[4735]: I1209 15:10:18.967235 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-gvl4x" podStartSLOduration=1.936862343 podStartE2EDuration="15.967222513s" podCreationTimestamp="2025-12-09 15:10:03 +0000 UTC" firstStartedPulling="2025-12-09 15:10:04.362337976 +0000 UTC m=+683.287176604" lastFinishedPulling="2025-12-09 15:10:18.392698146 +0000 UTC m=+697.317536774" observedRunningTime="2025-12-09 15:10:18.966100193 +0000 UTC m=+697.890938821" watchObservedRunningTime="2025-12-09 15:10:18.967222513 +0000 UTC m=+697.892061141" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.312471 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.314855 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.317448 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.317657 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.324728 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.336458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") " pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.336777 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79556\" (UniqueName: \"kubernetes.io/projected/0361d5a3-7ad0-4aca-bd59-8cee72a4d276-kube-api-access-79556\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") " pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.437066 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") " pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.437382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79556\" (UniqueName: \"kubernetes.io/projected/0361d5a3-7ad0-4aca-bd59-8cee72a4d276-kube-api-access-79556\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") " pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.439234 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.439261 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/665dc3128b4eca96935b988d75e276f2eedc9e65004a7f11c592cc6e507021d3/globalmount\"" pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.453045 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79556\" (UniqueName: \"kubernetes.io/projected/0361d5a3-7ad0-4aca-bd59-8cee72a4d276-kube-api-access-79556\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") " pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.456026 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3aa567d-79ec-43fd-a97f-6ac997211433\") pod \"minio\" (UID: \"0361d5a3-7ad0-4aca-bd59-8cee72a4d276\") " pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.631168 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 09 15:10:23 crc kubenswrapper[4735]: I1209 15:10:23.974125 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 09 15:10:24 crc kubenswrapper[4735]: I1209 15:10:24.983648 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"0361d5a3-7ad0-4aca-bd59-8cee72a4d276","Type":"ContainerStarted","Data":"89b8a2feaa823f3fa1999db41ae32df83ea5c41c736ed7f6f88ac0fc418f201c"} Dec 09 15:10:26 crc kubenswrapper[4735]: I1209 15:10:26.994414 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"0361d5a3-7ad0-4aca-bd59-8cee72a4d276","Type":"ContainerStarted","Data":"6e9851bbbdcdb5f85c2c6d4c2563bea6f3cb578e4322b9beae224b8c8672dcd9"} Dec 09 15:10:27 crc kubenswrapper[4735]: I1209 15:10:27.005835 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.577151883 podStartE2EDuration="6.005820403s" podCreationTimestamp="2025-12-09 15:10:21 +0000 UTC" firstStartedPulling="2025-12-09 15:10:23.979939034 +0000 UTC m=+702.904777652" lastFinishedPulling="2025-12-09 15:10:26.408607544 +0000 UTC m=+705.333446172" observedRunningTime="2025-12-09 15:10:27.005168508 +0000 UTC m=+705.930007136" watchObservedRunningTime="2025-12-09 15:10:27.005820403 +0000 UTC m=+705.930659032" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.073210 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.074075 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.078721 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.078882 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.078917 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-btnt4" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.078920 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.078949 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.084661 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.206749 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-g7jxl"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.207504 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.209134 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.209455 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.209604 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.216761 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-g7jxl"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.221624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.221666 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.221780 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228hz\" (UniqueName: \"kubernetes.io/projected/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-kube-api-access-228hz\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: 
\"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.221820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-config\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.221837 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.256458 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.257250 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.259368 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.259841 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.269632 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.322900 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228hz\" (UniqueName: \"kubernetes.io/projected/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-kube-api-access-228hz\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.322951 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-config\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.322971 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323011 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-querier-grpc\") pod 
\"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323039 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323060 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323085 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d84c01-0813-4862-add0-1086e3ca895f-config\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323105 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzlsj\" (UniqueName: \"kubernetes.io/projected/b0d84c01-0813-4862-add0-1086e3ca895f-kube-api-access-nzlsj\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323125 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.323172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.324073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-ca-bundle\") pod 
\"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.324298 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-config\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.337797 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.340117 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-55c764d7cd-dftlk"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.340981 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.341357 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.343036 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.343189 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.343379 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.343538 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.345857 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.351448 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.352280 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.354052 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-685tk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.354418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-228hz\" (UniqueName: \"kubernetes.io/projected/80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41-kube-api-access-228hz\") pod \"logging-loki-distributor-76cc67bf56-rqpx8\" (UID: \"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.375996 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.389739 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425201 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-55c764d7cd-dftlk"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq2lh\" (UniqueName: \"kubernetes.io/projected/a9909b2d-93ab-4b43-b415-7d52ac5031e4-kube-api-access-pq2lh\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425807 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9909b2d-93ab-4b43-b415-7d52ac5031e4-config\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425875 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d84c01-0813-4862-add0-1086e3ca895f-config\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425893 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425950 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nzlsj\" (UniqueName: \"kubernetes.io/projected/b0d84c01-0813-4862-add0-1086e3ca895f-kube-api-access-nzlsj\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.425995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.426023 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.426071 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.426235 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.426280 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.427633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.427874 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0d84c01-0813-4862-add0-1086e3ca895f-config\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.430438 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.430727 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.434023 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/b0d84c01-0813-4862-add0-1086e3ca895f-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.440283 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzlsj\" (UniqueName: \"kubernetes.io/projected/b0d84c01-0813-4862-add0-1086e3ca895f-kube-api-access-nzlsj\") pod \"logging-loki-querier-5895d59bb8-g7jxl\" (UID: \"b0d84c01-0813-4862-add0-1086e3ca895f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.519892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527323 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dtns\" (UniqueName: \"kubernetes.io/projected/8c1d40e2-618c-45c5-bee9-c620c977a7a5-kube-api-access-8dtns\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527367 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527417 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-tenants\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527436 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r79zv\" (UniqueName: \"kubernetes.io/projected/7d3963f2-7997-4cd6-8d9f-accd69aac83b-kube-api-access-r79zv\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc 
kubenswrapper[4735]: I1209 15:10:30.527481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-rbac\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-lokistack-gateway\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527643 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-rbac\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527704 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-tenants\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq2lh\" (UniqueName: \"kubernetes.io/projected/a9909b2d-93ab-4b43-b415-7d52ac5031e4-kube-api-access-pq2lh\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 
15:10:30.527850 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-tls-secret\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527873 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9909b2d-93ab-4b43-b415-7d52ac5031e4-config\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527895 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527910 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-tls-secret\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527957 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527975 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.527995 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.528008 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-lokistack-gateway\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.529635 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9909b2d-93ab-4b43-b415-7d52ac5031e4-config\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.529926 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.531423 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.531741 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/a9909b2d-93ab-4b43-b415-7d52ac5031e4-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.542360 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq2lh\" (UniqueName: \"kubernetes.io/projected/a9909b2d-93ab-4b43-b415-7d52ac5031e4-kube-api-access-pq2lh\") pod \"logging-loki-query-frontend-84558f7c9f-7flxb\" (UID: \"a9909b2d-93ab-4b43-b415-7d52ac5031e4\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.568894 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632217 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-tenants\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632260 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r79zv\" (UniqueName: \"kubernetes.io/projected/7d3963f2-7997-4cd6-8d9f-accd69aac83b-kube-api-access-r79zv\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632284 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-rbac\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632321 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-lokistack-gateway\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632337 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-rbac\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-tenants\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632412 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-tls-secret\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632453 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-tls-secret\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632467 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632482 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632498 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-lokistack-gateway\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632546 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dtns\" (UniqueName: \"kubernetes.io/projected/8c1d40e2-618c-45c5-bee9-c620c977a7a5-kube-api-access-8dtns\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.632574 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.639224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-ca-bundle\") 
pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.641369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-lokistack-gateway\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.642109 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.642902 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-rbac\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.643969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-lokistack-gateway\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.644033 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.644430 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-tenants\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.644829 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/7d3963f2-7997-4cd6-8d9f-accd69aac83b-rbac\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.648398 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-ca-bundle\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.650084 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.651534 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8c1d40e2-618c-45c5-bee9-c620c977a7a5-tls-secret\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.664535 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-tenants\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.675054 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-tls-secret\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.682240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/7d3963f2-7997-4cd6-8d9f-accd69aac83b-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.687405 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dtns\" (UniqueName: \"kubernetes.io/projected/8c1d40e2-618c-45c5-bee9-c620c977a7a5-kube-api-access-8dtns\") pod \"logging-loki-gateway-55c764d7cd-dftlk\" (UID: \"8c1d40e2-618c-45c5-bee9-c620c977a7a5\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.687715 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.690123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r79zv\" (UniqueName: \"kubernetes.io/projected/7d3963f2-7997-4cd6-8d9f-accd69aac83b-kube-api-access-r79zv\") pod \"logging-loki-gateway-55c764d7cd-lvwlq\" (UID: \"7d3963f2-7997-4cd6-8d9f-accd69aac83b\") " pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.708836 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.793341 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-g7jxl"] Dec 09 15:10:30 crc kubenswrapper[4735]: I1209 15:10:30.796112 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.014664 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" event={"ID":"b0d84c01-0813-4862-add0-1086e3ca895f","Type":"ContainerStarted","Data":"dba487d1935a238e1df996459195fd9cc2a417703b6cf99f41e2e20878ed617a"} Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.015804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" event={"ID":"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41","Type":"ContainerStarted","Data":"5b3bb9cf9f06add7529fba48cb2a13c1aaa3492b0a198b2bb30dc78c5e9d0fa4"} Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.114068 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb"] Dec 09 15:10:31 crc kubenswrapper[4735]: W1209 15:10:31.117086 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9909b2d_93ab_4b43_b415_7d52ac5031e4.slice/crio-f29923fb287cf3fd59116fe3dc5b5bf8c9155e75532d9afc51479fde74a37363 WatchSource:0}: Error finding container f29923fb287cf3fd59116fe3dc5b5bf8c9155e75532d9afc51479fde74a37363: Status 404 returned error can't find the container with id f29923fb287cf3fd59116fe3dc5b5bf8c9155e75532d9afc51479fde74a37363 Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.151904 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-55c764d7cd-dftlk"] Dec 09 15:10:31 crc kubenswrapper[4735]: W1209 15:10:31.153158 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c1d40e2_618c_45c5_bee9_c620c977a7a5.slice/crio-b6867a70c0b8afe119f08e809e625ed8ae3a478bc6f565534eee18b7e17f8cf6 WatchSource:0}: Error finding container b6867a70c0b8afe119f08e809e625ed8ae3a478bc6f565534eee18b7e17f8cf6: Status 404 returned error can't find the container with id b6867a70c0b8afe119f08e809e625ed8ae3a478bc6f565534eee18b7e17f8cf6 Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.179700 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq"] Dec 09 15:10:31 crc kubenswrapper[4735]: W1209 15:10:31.182416 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d3963f2_7997_4cd6_8d9f_accd69aac83b.slice/crio-e9592173b6e5142d30823e0086bc3a24c6369033f44ea8f1e561900c423d9bb3 WatchSource:0}: Error finding container e9592173b6e5142d30823e0086bc3a24c6369033f44ea8f1e561900c423d9bb3: Status 404 returned error can't find the container with id e9592173b6e5142d30823e0086bc3a24c6369033f44ea8f1e561900c423d9bb3 Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.223224 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.223962 4735 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.225641 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.226718 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.232729 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.247211 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.247982 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.249731 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.249982 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.258629 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.318730 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.319485 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.320524 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.321314 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.328463 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.340950 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.340999 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpsc9\" (UniqueName: \"kubernetes.io/projected/c09b376f-0220-475e-8fed-5219c7e7a147-kube-api-access-hpsc9\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341025 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6pt\" (UniqueName: \"kubernetes.io/projected/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-kube-api-access-kh6pt\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341045 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c09b376f-0220-475e-8fed-5219c7e7a147-config\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9078e351-7210-4a21-b6ed-ab158ed16175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9078e351-7210-4a21-b6ed-ab158ed16175\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341138 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-config\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 
15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341200 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341291 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341403 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341440 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341459 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341569 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341607 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.341633 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442632 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442715 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442734 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442751 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442768 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442804 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442823 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " 
pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442861 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442880 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ths6\" (UniqueName: \"kubernetes.io/projected/dacf0d91-16bc-4578-b424-f4524b47d537-kube-api-access-7ths6\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442940 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442957 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpsc9\" (UniqueName: \"kubernetes.io/projected/c09b376f-0220-475e-8fed-5219c7e7a147-kube-api-access-hpsc9\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.442989 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dacf0d91-16bc-4578-b424-f4524b47d537-config\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.443006 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kh6pt\" (UniqueName: \"kubernetes.io/projected/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-kube-api-access-kh6pt\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.443023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c09b376f-0220-475e-8fed-5219c7e7a147-config\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.443040 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9078e351-7210-4a21-b6ed-ab158ed16175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9078e351-7210-4a21-b6ed-ab158ed16175\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.443055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.443071 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-config\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.444312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-config\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.445316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c09b376f-0220-475e-8fed-5219c7e7a147-config\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.446637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.446846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.447306 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.448210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.448817 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.448840 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b8606f598e0a4113b95208297327e1160c860e1f00769ea4da1a0574228157e/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.448978 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.449048 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9078e351-7210-4a21-b6ed-ab158ed16175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9078e351-7210-4a21-b6ed-ab158ed16175\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71c433d0e265e35259d20140b209e2bb22ba31e2ecf80ffc7c45daa06fd02dba/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.449457 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.449768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.449888 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.449983 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f89479ca9e3e6dddccf161f45d3bb90bc47b8d019cc69baa8a14e64281364c43/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.450974 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.452589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/c09b376f-0220-475e-8fed-5219c7e7a147-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.460014 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpsc9\" (UniqueName: \"kubernetes.io/projected/c09b376f-0220-475e-8fed-5219c7e7a147-kube-api-access-hpsc9\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.461999 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh6pt\" (UniqueName: \"kubernetes.io/projected/a771f4c8-0d04-4c79-be0d-5b45b4b5a037-kube-api-access-kh6pt\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.467443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6e5377e-1886-4b13-a2df-07413cd5ad77\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.477190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a29c1478-0deb-43b5-bd7b-3ea4937ac797\") pod \"logging-loki-ingester-0\" (UID: \"c09b376f-0220-475e-8fed-5219c7e7a147\") " pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.477214 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9078e351-7210-4a21-b6ed-ab158ed16175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9078e351-7210-4a21-b6ed-ab158ed16175\") pod \"logging-loki-compactor-0\" (UID: \"a771f4c8-0d04-4c79-be0d-5b45b4b5a037\") " pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.536964 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.543855 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.543899 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.543978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dacf0d91-16bc-4578-b424-f4524b47d537-config\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.544061 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.544095 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.544138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.544166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ths6\" (UniqueName: \"kubernetes.io/projected/dacf0d91-16bc-4578-b424-f4524b47d537-kube-api-access-7ths6\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.549753 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.552471 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dacf0d91-16bc-4578-b424-f4524b47d537-config\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.555432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.558269 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.558348 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/350ed22c2f2b5f90eeded2b82b6f7cdac0a63bf5b79d4b27020f31c4c35afaec/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.558570 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.559055 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ths6\" (UniqueName: \"kubernetes.io/projected/dacf0d91-16bc-4578-b424-f4524b47d537-kube-api-access-7ths6\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.560022 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.563421 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/dacf0d91-16bc-4578-b424-f4524b47d537-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.580303 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5219a4b4-c24d-4ba7-ac17-99f3359dad68\") pod \"logging-loki-index-gateway-0\" (UID: \"dacf0d91-16bc-4578-b424-f4524b47d537\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.659599 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.699504 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: W1209 15:10:31.704053 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09b376f_0220_475e_8fed_5219c7e7a147.slice/crio-f75cfc5e70c6f8fb81c10b54bdc477b2bd75b7d319c96c3b361db5c16f585b74 WatchSource:0}: Error finding container f75cfc5e70c6f8fb81c10b54bdc477b2bd75b7d319c96c3b361db5c16f585b74: Status 404 returned error can't find the container with id f75cfc5e70c6f8fb81c10b54bdc477b2bd75b7d319c96c3b361db5c16f585b74 Dec 09 15:10:31 crc kubenswrapper[4735]: I1209 15:10:31.733758 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 09 15:10:31 crc kubenswrapper[4735]: W1209 15:10:31.739058 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda771f4c8_0d04_4c79_be0d_5b45b4b5a037.slice/crio-079e97c4d606afcec7c341f681e11197b6bbbedc924d51f02dfd8cdd5c73c4dc WatchSource:0}: Error finding container 079e97c4d606afcec7c341f681e11197b6bbbedc924d51f02dfd8cdd5c73c4dc: Status 404 returned error can't find the container with id 079e97c4d606afcec7c341f681e11197b6bbbedc924d51f02dfd8cdd5c73c4dc Dec 09 15:10:32 crc kubenswrapper[4735]: I1209 15:10:32.000172 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 09 15:10:32 crc kubenswrapper[4735]: W1209 15:10:32.004984 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddacf0d91_16bc_4578_b424_f4524b47d537.slice/crio-b1294295ec7e5e4207199138ec599559d084e22453ae11eae39faabd2bab1151 WatchSource:0}: Error finding container b1294295ec7e5e4207199138ec599559d084e22453ae11eae39faabd2bab1151: Status 404 returned error can't find the container with id b1294295ec7e5e4207199138ec599559d084e22453ae11eae39faabd2bab1151 Dec 09 15:10:32 crc kubenswrapper[4735]: I1209 15:10:32.021592 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" event={"ID":"a9909b2d-93ab-4b43-b415-7d52ac5031e4","Type":"ContainerStarted","Data":"f29923fb287cf3fd59116fe3dc5b5bf8c9155e75532d9afc51479fde74a37363"} Dec 09 15:10:32 crc kubenswrapper[4735]: I1209 15:10:32.022541 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c09b376f-0220-475e-8fed-5219c7e7a147","Type":"ContainerStarted","Data":"f75cfc5e70c6f8fb81c10b54bdc477b2bd75b7d319c96c3b361db5c16f585b74"} Dec 09 15:10:32 crc kubenswrapper[4735]: I1209 15:10:32.023653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" event={"ID":"7d3963f2-7997-4cd6-8d9f-accd69aac83b","Type":"ContainerStarted","Data":"e9592173b6e5142d30823e0086bc3a24c6369033f44ea8f1e561900c423d9bb3"} Dec 09 15:10:32 crc kubenswrapper[4735]: I1209 15:10:32.024431 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" event={"ID":"8c1d40e2-618c-45c5-bee9-c620c977a7a5","Type":"ContainerStarted","Data":"b6867a70c0b8afe119f08e809e625ed8ae3a478bc6f565534eee18b7e17f8cf6"} Dec 09 15:10:32 crc 
kubenswrapper[4735]: I1209 15:10:32.025128 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"dacf0d91-16bc-4578-b424-f4524b47d537","Type":"ContainerStarted","Data":"b1294295ec7e5e4207199138ec599559d084e22453ae11eae39faabd2bab1151"} Dec 09 15:10:32 crc kubenswrapper[4735]: I1209 15:10:32.025962 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"a771f4c8-0d04-4c79-be0d-5b45b4b5a037","Type":"ContainerStarted","Data":"079e97c4d606afcec7c341f681e11197b6bbbedc924d51f02dfd8cdd5c73c4dc"} Dec 09 15:10:34 crc kubenswrapper[4735]: I1209 15:10:34.335990 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:10:34 crc kubenswrapper[4735]: I1209 15:10:34.336391 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.044746 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"a771f4c8-0d04-4c79-be0d-5b45b4b5a037","Type":"ContainerStarted","Data":"2d4a030fbad8d3add7e4c39e43caa6cc773c804a0bb6a6a17ce80b2a6fcfc847"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.044873 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.046283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" event={"ID":"a9909b2d-93ab-4b43-b415-7d52ac5031e4","Type":"ContainerStarted","Data":"094ce97c061890d540cdd4b2b27f23df4d7d2ed4712123243cbc26a2068a96f3"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.046414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.047534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" event={"ID":"80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41","Type":"ContainerStarted","Data":"da43ac7fc102c72a71c5facdf17287a5ac0acfe3dc9cf641e4720fe95fa8ecdc"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.047651 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.048817 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"c09b376f-0220-475e-8fed-5219c7e7a147","Type":"ContainerStarted","Data":"96742232bf81fb87ba2df6df6cc1e57f740f375cc3ab65a29be7fb429dea9e7a"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.048864 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.049943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" event={"ID":"7d3963f2-7997-4cd6-8d9f-accd69aac83b","Type":"ContainerStarted","Data":"8601254bc1fbd3fb918e37e9c08934ae335d863140733b9e61768e4124375b2f"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.050952 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" event={"ID":"8c1d40e2-618c-45c5-bee9-c620c977a7a5","Type":"ContainerStarted","Data":"7485c76afc78c8393a0c97d9672dbda70fb7fe2ad6891883f0a8ea62e86212c3"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.052060 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"dacf0d91-16bc-4578-b424-f4524b47d537","Type":"ContainerStarted","Data":"2e66db5cc4f1b28aef9396cb0bb9a661e703c52fb5d85b15d766f6cb3027a20a"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.052097 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.053176 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" event={"ID":"b0d84c01-0813-4862-add0-1086e3ca895f","Type":"ContainerStarted","Data":"789a3fb34b0315ca0408812a84bf26317d7b0a5e256f95c89dd4c90478c724e0"} Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.053286 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.062931 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.753431498 podStartE2EDuration="5.062918633s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:31.741246998 +0000 UTC m=+710.666085626" lastFinishedPulling="2025-12-09 15:10:34.050734134 +0000 UTC m=+712.975572761" observedRunningTime="2025-12-09 15:10:35.059745516 +0000 UTC m=+713.984584144" watchObservedRunningTime="2025-12-09 15:10:35.062918633 +0000 UTC m=+713.987757262" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.076562 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=2.743302991 podStartE2EDuration="5.076546348s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:31.705853305 +0000 UTC m=+710.630691933" lastFinishedPulling="2025-12-09 15:10:34.039096661 +0000 UTC m=+712.963935290" observedRunningTime="2025-12-09 15:10:35.072843837 +0000 UTC m=+713.997682464" watchObservedRunningTime="2025-12-09 15:10:35.076546348 +0000 UTC m=+714.001384976" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.093577 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" podStartSLOduration=1.868383472 podStartE2EDuration="5.093562104s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:30.825549 +0000 UTC m=+709.750387627" lastFinishedPulling="2025-12-09 15:10:34.050727631 +0000 UTC m=+712.975566259" observedRunningTime="2025-12-09 15:10:35.091346108 +0000 UTC m=+714.016184735" watchObservedRunningTime="2025-12-09 15:10:35.093562104 +0000 UTC m=+714.018400732" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.106427 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" podStartSLOduration=1.871940561 podStartE2EDuration="5.106415283s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:30.818375661 +0000 UTC m=+709.743214289" lastFinishedPulling="2025-12-09 15:10:34.052850383 +0000 UTC m=+712.977689011" observedRunningTime="2025-12-09 15:10:35.102554051 +0000 UTC m=+714.027392680" watchObservedRunningTime="2025-12-09 15:10:35.106415283 +0000 UTC m=+714.031253910" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.115915 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.071277153 podStartE2EDuration="5.1159016s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:32.006757445 +0000 UTC m=+710.931596073" lastFinishedPulling="2025-12-09 15:10:34.051381892 +0000 UTC m=+712.976220520" observedRunningTime="2025-12-09 15:10:35.114577792 +0000 UTC m=+714.039416419" watchObservedRunningTime="2025-12-09 15:10:35.1159016 +0000 UTC m=+714.040740218" Dec 09 15:10:35 crc kubenswrapper[4735]: I1209 15:10:35.127370 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" podStartSLOduration=2.194277872 podStartE2EDuration="5.127361018s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:31.118929031 +0000 UTC m=+710.043767659" lastFinishedPulling="2025-12-09 15:10:34.052012178 +0000 UTC m=+712.976850805" observedRunningTime="2025-12-09 15:10:35.124225723 +0000 UTC m=+714.049064381" watchObservedRunningTime="2025-12-09 15:10:35.127361018 +0000 UTC m=+714.052199646" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.065273 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" event={"ID":"7d3963f2-7997-4cd6-8d9f-accd69aac83b","Type":"ContainerStarted","Data":"77283f379368ea43ac66eec10677ff62b400e39ecd5a82883e688211a57e4884"} Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.065499 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.067125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" event={"ID":"8c1d40e2-618c-45c5-bee9-c620c977a7a5","Type":"ContainerStarted","Data":"e99ca5c5fc57415c42ea21173c2b8748f33084cf36dab8167cc2fec2b66a1dab"} Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.067305 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.067382 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.074010 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.074054 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.075736 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.083424 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" podStartSLOduration=2.356864578 podStartE2EDuration="7.083411535s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:31.184089748 +0000 UTC m=+710.108928376" lastFinishedPulling="2025-12-09 15:10:35.910636704 +0000 UTC m=+714.835475333" observedRunningTime="2025-12-09 15:10:37.079761993 +0000 UTC m=+716.004600620" watchObservedRunningTime="2025-12-09 15:10:37.083411535 +0000 UTC m=+716.008250162" Dec 09 15:10:37 crc kubenswrapper[4735]: I1209 15:10:37.096290 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-55c764d7cd-dftlk" podStartSLOduration=2.344682989 podStartE2EDuration="7.096279419s" podCreationTimestamp="2025-12-09 15:10:30 +0000 UTC" firstStartedPulling="2025-12-09 15:10:31.1551923 +0000 UTC m=+710.080030928" lastFinishedPulling="2025-12-09 15:10:35.90678873 +0000 UTC m=+714.831627358" observedRunningTime="2025-12-09 15:10:37.093263318 +0000 UTC m=+716.018101956" watchObservedRunningTime="2025-12-09 15:10:37.096279419 +0000 UTC m=+716.021118047" Dec 09 15:10:38 crc kubenswrapper[4735]: I1209 15:10:38.072190 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:38 crc kubenswrapper[4735]: I1209 15:10:38.080777 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-55c764d7cd-lvwlq" Dec 09 15:10:50 crc kubenswrapper[4735]: I1209 15:10:50.394927 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-rqpx8" Dec 09 15:10:50 crc kubenswrapper[4735]: I1209 15:10:50.546748 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-g7jxl" Dec 09 15:10:50 crc kubenswrapper[4735]: I1209 15:10:50.573861 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-7flxb" Dec 09 15:10:51 crc kubenswrapper[4735]: I1209 15:10:51.543499 4735 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 09 15:10:51 crc kubenswrapper[4735]: I1209 15:10:51.544002 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c09b376f-0220-475e-8fed-5219c7e7a147" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:10:51 crc kubenswrapper[4735]: I1209 15:10:51.564620 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Dec 09 15:10:51 crc kubenswrapper[4735]: I1209 15:10:51.667393 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Dec 09 15:11:01 crc kubenswrapper[4735]: I1209 15:11:01.544325 4735 patch_prober.go:28] interesting pod/logging-loki-ingester-0 
container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Dec 09 15:11:01 crc kubenswrapper[4735]: I1209 15:11:01.544743 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c09b376f-0220-475e-8fed-5219c7e7a147" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:11:04 crc kubenswrapper[4735]: I1209 15:11:04.335998 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:11:04 crc kubenswrapper[4735]: I1209 15:11:04.336229 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:11:04 crc kubenswrapper[4735]: I1209 15:11:04.336268 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:11:04 crc kubenswrapper[4735]: I1209 15:11:04.336744 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32f296cb608e9d91aaf8195ce2837766de47464c288698a32b6b4cd28703999c"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:11:04 crc kubenswrapper[4735]: I1209 15:11:04.336792 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://32f296cb608e9d91aaf8195ce2837766de47464c288698a32b6b4cd28703999c" gracePeriod=600 Dec 09 15:11:05 crc kubenswrapper[4735]: I1209 15:11:05.221494 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="32f296cb608e9d91aaf8195ce2837766de47464c288698a32b6b4cd28703999c" exitCode=0 Dec 09 15:11:05 crc kubenswrapper[4735]: I1209 15:11:05.221546 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"32f296cb608e9d91aaf8195ce2837766de47464c288698a32b6b4cd28703999c"} Dec 09 15:11:05 crc kubenswrapper[4735]: I1209 15:11:05.221886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"a22ce376c7f734447e3e64908bb1c07a5b4ae150029c792b81dee199a1f86208"} Dec 09 15:11:05 crc kubenswrapper[4735]: I1209 15:11:05.221910 4735 scope.go:117] "RemoveContainer" containerID="7283bb739f18015c714dddde96f1ec57c18cf87c97f630bb012cb5c7d38a8190" Dec 09 15:11:07 crc kubenswrapper[4735]: I1209 15:11:07.044397 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
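The two probe threads above show both outcomes the kubelet distinguishes: the loki-ingester readiness probe returning 503 only keeps the pod out of service endpoints until it reports ready, while the machine-config-daemon liveness failure ("connection refused" on 127.0.0.1:8798/health) makes the kubelet kill the container with gracePeriod=600 and start a replacement, as logged at 15:11:04–15:11:05. A minimal sketch of what such an HTTP probe amounts to, using the endpoint from the log; the real prober (prober.go) adds configurable thresholds, headers, and result caching, and the timeout below is an arbitrary placeholder:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// Illustrative only: roughly what an HTTP liveness/readiness probe boils down to.
func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused" above
		return err
	}
	defer resp.Body.Close()
	// HTTP probes treat 2xx/3xx as success; the ingester's 503 above is a failure.
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// Endpoint taken from the machine-config-daemon liveness entry above.
	if err := probe("http://127.0.0.1:8798/health", time.Second); err != nil {
		fmt.Println("probe failed:", err)
	}
}
```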
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 09 15:11:11 crc kubenswrapper[4735]: I1209 15:11:11.541459 4735 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 09 15:11:11 crc kubenswrapper[4735]: I1209 15:11:11.541898 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c09b376f-0220-475e-8fed-5219c7e7a147" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:11:21 crc kubenswrapper[4735]: I1209 15:11:21.540820 4735 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Dec 09 15:11:21 crc kubenswrapper[4735]: I1209 15:11:21.541219 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="c09b376f-0220-475e-8fed-5219c7e7a147" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 09 15:11:31 crc kubenswrapper[4735]: I1209 15:11:31.540741 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.557250 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-7b5dt"] Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.558356 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.560054 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.560151 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.561107 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-r92h9" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.561115 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.561652 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.566409 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.571282 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-7b5dt"] Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.692235 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-7b5dt"] Dec 09 15:11:50 crc kubenswrapper[4735]: E1209 15:11:50.692844 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-nld5s metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-7b5dt" podUID="05cc88d6-569b-42bb-ba73-d77abbacd276" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.721842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config-openshift-service-cacrt\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.721948 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-entrypoint\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.721997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nld5s\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-kube-api-access-nld5s\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722015 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-metrics\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722094 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722133 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05cc88d6-569b-42bb-ba73-d77abbacd276-tmp\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722162 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-token\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722177 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-sa-token\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-syslog-receiver\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722301 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-trusted-ca\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.722321 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/05cc88d6-569b-42bb-ba73-d77abbacd276-datadir\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.823436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-syslog-receiver\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.823708 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-trusted-ca\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.823795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/05cc88d6-569b-42bb-ba73-d77abbacd276-datadir\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.823873 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/05cc88d6-569b-42bb-ba73-d77abbacd276-datadir\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.823958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config-openshift-service-cacrt\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-entrypoint\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824121 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nld5s\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-kube-api-access-nld5s\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824189 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-metrics\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824298 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05cc88d6-569b-42bb-ba73-d77abbacd276-tmp\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824501 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-token\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824576 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config-openshift-service-cacrt\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: 
I1209 15:11:50.824608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-trusted-ca\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824728 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-sa-token\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.824771 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-entrypoint\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.825034 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.828173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05cc88d6-569b-42bb-ba73-d77abbacd276-tmp\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.828477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-syslog-receiver\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.828555 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-token\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.828839 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-metrics\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.837325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nld5s\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-kube-api-access-nld5s\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:50 crc kubenswrapper[4735]: I1209 15:11:50.837712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-sa-token\") pod \"collector-7b5dt\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " pod="openshift-logging/collector-7b5dt" Dec 09 15:11:51 crc 
kubenswrapper[4735]: I1209 15:11:51.436959 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7b5dt" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.443096 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7b5dt" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530668 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-sa-token\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530717 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05cc88d6-569b-42bb-ba73-d77abbacd276-tmp\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530746 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-metrics\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/05cc88d6-569b-42bb-ba73-d77abbacd276-datadir\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530873 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config-openshift-service-cacrt\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-token\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530926 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-trusted-ca\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530944 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nld5s\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-kube-api-access-nld5s\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530961 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-entrypoint\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.530984 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-syslog-receiver\") pod \"05cc88d6-569b-42bb-ba73-d77abbacd276\" (UID: \"05cc88d6-569b-42bb-ba73-d77abbacd276\") " Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.531203 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config" (OuterVolumeSpecName: "config") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.531402 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.531544 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.531559 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.531611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05cc88d6-569b-42bb-ba73-d77abbacd276-datadir" (OuterVolumeSpecName: "datadir") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.533766 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cc88d6-569b-42bb-ba73-d77abbacd276-tmp" (OuterVolumeSpecName: "tmp") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.533772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-metrics" (OuterVolumeSpecName: "metrics") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.533799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-sa-token" (OuterVolumeSpecName: "sa-token") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.534377 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.534411 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-token" (OuterVolumeSpecName: "collector-token") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.534658 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-kube-api-access-nld5s" (OuterVolumeSpecName: "kube-api-access-nld5s") pod "05cc88d6-569b-42bb-ba73-d77abbacd276" (UID: "05cc88d6-569b-42bb-ba73-d77abbacd276"). InnerVolumeSpecName "kube-api-access-nld5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632180 4735 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-metrics\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632207 4735 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/05cc88d6-569b-42bb-ba73-d77abbacd276-datadir\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632217 4735 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632229 4735 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-token\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632239 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632248 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nld5s\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-kube-api-access-nld5s\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632259 4735 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-entrypoint\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632268 4735 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/05cc88d6-569b-42bb-ba73-d77abbacd276-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632275 4735 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/05cc88d6-569b-42bb-ba73-d77abbacd276-sa-token\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632284 4735 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05cc88d6-569b-42bb-ba73-d77abbacd276-tmp\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:51 crc kubenswrapper[4735]: I1209 15:11:51.632291 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05cc88d6-569b-42bb-ba73-d77abbacd276-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.440869 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-7b5dt" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.474938 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-7b5dt"] Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.479298 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-7b5dt"] Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.483404 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-zw7mb"] Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.484246 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.486580 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.486754 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.486956 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-r92h9" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.487029 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.489913 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.490249 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zw7mb"] Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.495704 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-collector-syslog-receiver\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d504cda5-e7a2-45e3-b775-73b06f492def-datadir\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540347 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d504cda5-e7a2-45e3-b775-73b06f492def-sa-token\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-trusted-ca\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540574 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-metrics\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540664 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-collector-token\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-config-openshift-service-cacrt\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540737 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lp7\" (UniqueName: \"kubernetes.io/projected/d504cda5-e7a2-45e3-b775-73b06f492def-kube-api-access-w5lp7\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540801 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-entrypoint\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540842 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-config\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.540888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d504cda5-e7a2-45e3-b775-73b06f492def-tmp\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-entrypoint\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641319 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-config\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641347 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d504cda5-e7a2-45e3-b775-73b06f492def-tmp\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-collector-syslog-receiver\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641378 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d504cda5-e7a2-45e3-b775-73b06f492def-datadir\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641399 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d504cda5-e7a2-45e3-b775-73b06f492def-sa-token\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641428 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-trusted-ca\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641444 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-metrics\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641471 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-collector-token\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641571 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/d504cda5-e7a2-45e3-b775-73b06f492def-datadir\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641935 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-config-openshift-service-cacrt\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.641989 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lp7\" (UniqueName: \"kubernetes.io/projected/d504cda5-e7a2-45e3-b775-73b06f492def-kube-api-access-w5lp7\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: 
I1209 15:11:52.642265 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-entrypoint\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.642897 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-config-openshift-service-cacrt\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.642901 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-trusted-ca\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.643156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d504cda5-e7a2-45e3-b775-73b06f492def-config\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.644016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d504cda5-e7a2-45e3-b775-73b06f492def-tmp\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.644342 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-collector-syslog-receiver\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.646460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-metrics\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.646762 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/d504cda5-e7a2-45e3-b775-73b06f492def-collector-token\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.654825 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/d504cda5-e7a2-45e3-b775-73b06f492def-sa-token\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.655224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lp7\" (UniqueName: \"kubernetes.io/projected/d504cda5-e7a2-45e3-b775-73b06f492def-kube-api-access-w5lp7\") pod \"collector-zw7mb\" (UID: \"d504cda5-e7a2-45e3-b775-73b06f492def\") " 
pod="openshift-logging/collector-zw7mb" Dec 09 15:11:52 crc kubenswrapper[4735]: I1209 15:11:52.798075 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zw7mb" Dec 09 15:11:53 crc kubenswrapper[4735]: I1209 15:11:53.152961 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zw7mb"] Dec 09 15:11:53 crc kubenswrapper[4735]: I1209 15:11:53.423694 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cc88d6-569b-42bb-ba73-d77abbacd276" path="/var/lib/kubelet/pods/05cc88d6-569b-42bb-ba73-d77abbacd276/volumes" Dec 09 15:11:53 crc kubenswrapper[4735]: I1209 15:11:53.446633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-zw7mb" event={"ID":"d504cda5-e7a2-45e3-b775-73b06f492def","Type":"ContainerStarted","Data":"a1b2683a163a1454418f4ab0885e4c67003a17f61e107a305a28a26c9704ecd5"} Dec 09 15:12:01 crc kubenswrapper[4735]: I1209 15:12:01.492866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-zw7mb" event={"ID":"d504cda5-e7a2-45e3-b775-73b06f492def","Type":"ContainerStarted","Data":"1dc8490d8acf0f25492d739aaa11dbd9f2ec176ac2df3d8b073f0584f72ef9af"} Dec 09 15:12:01 crc kubenswrapper[4735]: I1209 15:12:01.509665 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-zw7mb" podStartSLOduration=2.116821797 podStartE2EDuration="9.509644904s" podCreationTimestamp="2025-12-09 15:11:52 +0000 UTC" firstStartedPulling="2025-12-09 15:11:53.163175847 +0000 UTC m=+792.088014475" lastFinishedPulling="2025-12-09 15:12:00.555998953 +0000 UTC m=+799.480837582" observedRunningTime="2025-12-09 15:12:01.508073608 +0000 UTC m=+800.432912247" watchObservedRunningTime="2025-12-09 15:12:01.509644904 +0000 UTC m=+800.434483531" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.354039 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8"] Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.355759 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.362105 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.362942 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8"] Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.479401 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.479554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqg2m\" (UniqueName: \"kubernetes.io/projected/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-kube-api-access-mqg2m\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.479637 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.581024 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.581171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqg2m\" (UniqueName: \"kubernetes.io/projected/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-kube-api-access-mqg2m\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.581236 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.581483 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.581694 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.598029 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqg2m\" (UniqueName: \"kubernetes.io/projected/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-kube-api-access-mqg2m\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:19 crc kubenswrapper[4735]: I1209 15:12:19.671928 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:20 crc kubenswrapper[4735]: I1209 15:12:20.129982 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8"] Dec 09 15:12:20 crc kubenswrapper[4735]: I1209 15:12:20.600720 4735 generic.go:334] "Generic (PLEG): container finished" podID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerID="1cddd41726197fc700a963f1978d7c7a93cba19de796697cf1f15515a41f7338" exitCode=0 Dec 09 15:12:20 crc kubenswrapper[4735]: I1209 15:12:20.600812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" event={"ID":"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2","Type":"ContainerDied","Data":"1cddd41726197fc700a963f1978d7c7a93cba19de796697cf1f15515a41f7338"} Dec 09 15:12:20 crc kubenswrapper[4735]: I1209 15:12:20.601196 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" event={"ID":"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2","Type":"ContainerStarted","Data":"4a296fa4d179c8d54fea0f34dab2eca8414dc6920e7b11f631913027eab8179d"} Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.714676 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtnjh"] Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.716111 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.724972 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtnjh"] Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.817916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-catalog-content\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.818052 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsffv\" (UniqueName: \"kubernetes.io/projected/599f38bd-441c-4e5a-b74d-516ffaecbc44-kube-api-access-gsffv\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.818126 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-utilities\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.919293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-catalog-content\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.919632 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsffv\" (UniqueName: \"kubernetes.io/projected/599f38bd-441c-4e5a-b74d-516ffaecbc44-kube-api-access-gsffv\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.919683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-utilities\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.919833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-catalog-content\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.920015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-utilities\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:21 crc kubenswrapper[4735]: I1209 15:12:21.935255 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gsffv\" (UniqueName: \"kubernetes.io/projected/599f38bd-441c-4e5a-b74d-516ffaecbc44-kube-api-access-gsffv\") pod \"redhat-operators-jtnjh\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.029506 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.250253 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtnjh"] Dec 09 15:12:22 crc kubenswrapper[4735]: W1209 15:12:22.255528 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod599f38bd_441c_4e5a_b74d_516ffaecbc44.slice/crio-e75895f766989ddd420bfdc6feb137adfab67b139c88995cec911747587a60a4 WatchSource:0}: Error finding container e75895f766989ddd420bfdc6feb137adfab67b139c88995cec911747587a60a4: Status 404 returned error can't find the container with id e75895f766989ddd420bfdc6feb137adfab67b139c88995cec911747587a60a4 Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.614726 4735 generic.go:334] "Generic (PLEG): container finished" podID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerID="7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7" exitCode=0 Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.614852 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerDied","Data":"7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7"} Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.615111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerStarted","Data":"e75895f766989ddd420bfdc6feb137adfab67b139c88995cec911747587a60a4"} Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.617081 4735 generic.go:334] "Generic (PLEG): container finished" podID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerID="3c00c4ca18ab894bbd6a542360c3b060822471a9fb9f6c4257d0fc0954da5bd9" exitCode=0 Dec 09 15:12:22 crc kubenswrapper[4735]: I1209 15:12:22.617122 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" event={"ID":"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2","Type":"ContainerDied","Data":"3c00c4ca18ab894bbd6a542360c3b060822471a9fb9f6c4257d0fc0954da5bd9"} Dec 09 15:12:23 crc kubenswrapper[4735]: I1209 15:12:23.625070 4735 generic.go:334] "Generic (PLEG): container finished" podID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerID="64c1f242291ee1318aaf20ab9affe8ef7125320ad0ad57dbe1e440927a7c1307" exitCode=0 Dec 09 15:12:23 crc kubenswrapper[4735]: I1209 15:12:23.625168 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" event={"ID":"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2","Type":"ContainerDied","Data":"64c1f242291ee1318aaf20ab9affe8ef7125320ad0ad57dbe1e440927a7c1307"} Dec 09 15:12:23 crc kubenswrapper[4735]: I1209 15:12:23.627013 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" 
event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerStarted","Data":"26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99"} Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.634056 4735 generic.go:334] "Generic (PLEG): container finished" podID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerID="26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99" exitCode=0 Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.634166 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerDied","Data":"26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99"} Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.885016 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.961946 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqg2m\" (UniqueName: \"kubernetes.io/projected/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-kube-api-access-mqg2m\") pod \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.962278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-bundle\") pod \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.962394 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-util\") pod \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\" (UID: \"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2\") " Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.962855 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-bundle" (OuterVolumeSpecName: "bundle") pod "991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" (UID: "991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.967717 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-kube-api-access-mqg2m" (OuterVolumeSpecName: "kube-api-access-mqg2m") pod "991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" (UID: "991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2"). InnerVolumeSpecName "kube-api-access-mqg2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:12:24 crc kubenswrapper[4735]: I1209 15:12:24.972327 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-util" (OuterVolumeSpecName: "util") pod "991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" (UID: "991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.065141 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.065176 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.065187 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqg2m\" (UniqueName: \"kubernetes.io/projected/991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2-kube-api-access-mqg2m\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.641723 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" event={"ID":"991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2","Type":"ContainerDied","Data":"4a296fa4d179c8d54fea0f34dab2eca8414dc6920e7b11f631913027eab8179d"} Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.641780 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a296fa4d179c8d54fea0f34dab2eca8414dc6920e7b11f631913027eab8179d" Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.641751 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8" Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.643778 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerStarted","Data":"b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347"} Dec 09 15:12:25 crc kubenswrapper[4735]: I1209 15:12:25.659100 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtnjh" podStartSLOduration=2.125618183 podStartE2EDuration="4.659090434s" podCreationTimestamp="2025-12-09 15:12:21 +0000 UTC" firstStartedPulling="2025-12-09 15:12:22.616914603 +0000 UTC m=+821.541753230" lastFinishedPulling="2025-12-09 15:12:25.150386853 +0000 UTC m=+824.075225481" observedRunningTime="2025-12-09 15:12:25.656336005 +0000 UTC m=+824.581174633" watchObservedRunningTime="2025-12-09 15:12:25.659090434 +0000 UTC m=+824.583929062" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.318596 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg"] Dec 09 15:12:29 crc kubenswrapper[4735]: E1209 15:12:29.319060 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="util" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.319072 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="util" Dec 09 15:12:29 crc kubenswrapper[4735]: E1209 15:12:29.319080 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="extract" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.319086 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="extract" Dec 09 15:12:29 crc 
kubenswrapper[4735]: E1209 15:12:29.319101 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="pull" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.319107 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="pull" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.319219 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2" containerName="extract" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.319660 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.322979 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.323174 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-982zm" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.323332 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.329134 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg"] Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.419180 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gcj\" (UniqueName: \"kubernetes.io/projected/3f70f38a-2464-434f-b617-e93a1ad19c15-kube-api-access-w9gcj\") pod \"nmstate-operator-5b5b58f5c8-5wslg\" (UID: \"3f70f38a-2464-434f-b617-e93a1ad19c15\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.520834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gcj\" (UniqueName: \"kubernetes.io/projected/3f70f38a-2464-434f-b617-e93a1ad19c15-kube-api-access-w9gcj\") pod \"nmstate-operator-5b5b58f5c8-5wslg\" (UID: \"3f70f38a-2464-434f-b617-e93a1ad19c15\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.536882 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9gcj\" (UniqueName: \"kubernetes.io/projected/3f70f38a-2464-434f-b617-e93a1ad19c15-kube-api-access-w9gcj\") pod \"nmstate-operator-5b5b58f5c8-5wslg\" (UID: \"3f70f38a-2464-434f-b617-e93a1ad19c15\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" Dec 09 15:12:29 crc kubenswrapper[4735]: I1209 15:12:29.634218 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" Dec 09 15:12:30 crc kubenswrapper[4735]: I1209 15:12:30.074130 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg"] Dec 09 15:12:30 crc kubenswrapper[4735]: W1209 15:12:30.079012 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f70f38a_2464_434f_b617_e93a1ad19c15.slice/crio-23911567c9a18ba5dac76db39ab5feb3e0eea3e08127c9ff0a6b3c4fabf82bca WatchSource:0}: Error finding container 23911567c9a18ba5dac76db39ab5feb3e0eea3e08127c9ff0a6b3c4fabf82bca: Status 404 returned error can't find the container with id 23911567c9a18ba5dac76db39ab5feb3e0eea3e08127c9ff0a6b3c4fabf82bca Dec 09 15:12:30 crc kubenswrapper[4735]: I1209 15:12:30.670802 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" event={"ID":"3f70f38a-2464-434f-b617-e93a1ad19c15","Type":"ContainerStarted","Data":"23911567c9a18ba5dac76db39ab5feb3e0eea3e08127c9ff0a6b3c4fabf82bca"} Dec 09 15:12:32 crc kubenswrapper[4735]: I1209 15:12:32.030320 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:32 crc kubenswrapper[4735]: I1209 15:12:32.030736 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:32 crc kubenswrapper[4735]: I1209 15:12:32.060436 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:32 crc kubenswrapper[4735]: I1209 15:12:32.682383 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" event={"ID":"3f70f38a-2464-434f-b617-e93a1ad19c15","Type":"ContainerStarted","Data":"63387e14478a90bb17ddd6f3b6c9feeb0529d8bad0bfba58ae1e9d1160295cbd"} Dec 09 15:12:32 crc kubenswrapper[4735]: I1209 15:12:32.694490 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5wslg" podStartSLOduration=1.9439991220000001 podStartE2EDuration="3.694479125s" podCreationTimestamp="2025-12-09 15:12:29 +0000 UTC" firstStartedPulling="2025-12-09 15:12:30.080944972 +0000 UTC m=+829.005783600" lastFinishedPulling="2025-12-09 15:12:31.831424975 +0000 UTC m=+830.756263603" observedRunningTime="2025-12-09 15:12:32.69337495 +0000 UTC m=+831.618213577" watchObservedRunningTime="2025-12-09 15:12:32.694479125 +0000 UTC m=+831.619317753" Dec 09 15:12:32 crc kubenswrapper[4735]: I1209 15:12:32.711579 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:34 crc kubenswrapper[4735]: I1209 15:12:34.307470 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtnjh"] Dec 09 15:12:34 crc kubenswrapper[4735]: I1209 15:12:34.691363 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtnjh" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="registry-server" containerID="cri-o://b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347" gracePeriod=2 Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.105244 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.297804 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsffv\" (UniqueName: \"kubernetes.io/projected/599f38bd-441c-4e5a-b74d-516ffaecbc44-kube-api-access-gsffv\") pod \"599f38bd-441c-4e5a-b74d-516ffaecbc44\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.297878 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-utilities\") pod \"599f38bd-441c-4e5a-b74d-516ffaecbc44\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.297917 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-catalog-content\") pod \"599f38bd-441c-4e5a-b74d-516ffaecbc44\" (UID: \"599f38bd-441c-4e5a-b74d-516ffaecbc44\") " Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.298631 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-utilities" (OuterVolumeSpecName: "utilities") pod "599f38bd-441c-4e5a-b74d-516ffaecbc44" (UID: "599f38bd-441c-4e5a-b74d-516ffaecbc44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.302060 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/599f38bd-441c-4e5a-b74d-516ffaecbc44-kube-api-access-gsffv" (OuterVolumeSpecName: "kube-api-access-gsffv") pod "599f38bd-441c-4e5a-b74d-516ffaecbc44" (UID: "599f38bd-441c-4e5a-b74d-516ffaecbc44"). InnerVolumeSpecName "kube-api-access-gsffv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.370791 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "599f38bd-441c-4e5a-b74d-516ffaecbc44" (UID: "599f38bd-441c-4e5a-b74d-516ffaecbc44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.399667 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsffv\" (UniqueName: \"kubernetes.io/projected/599f38bd-441c-4e5a-b74d-516ffaecbc44-kube-api-access-gsffv\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.399701 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.399713 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/599f38bd-441c-4e5a-b74d-516ffaecbc44-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.698452 4735 generic.go:334] "Generic (PLEG): container finished" podID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerID="b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347" exitCode=0 Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.698491 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerDied","Data":"b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347"} Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.698530 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtnjh" event={"ID":"599f38bd-441c-4e5a-b74d-516ffaecbc44","Type":"ContainerDied","Data":"e75895f766989ddd420bfdc6feb137adfab67b139c88995cec911747587a60a4"} Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.698527 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtnjh" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.698542 4735 scope.go:117] "RemoveContainer" containerID="b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.715674 4735 scope.go:117] "RemoveContainer" containerID="26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.720975 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtnjh"] Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.728937 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtnjh"] Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.732351 4735 scope.go:117] "RemoveContainer" containerID="7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.750390 4735 scope.go:117] "RemoveContainer" containerID="b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347" Dec 09 15:12:35 crc kubenswrapper[4735]: E1209 15:12:35.750826 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347\": container with ID starting with b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347 not found: ID does not exist" containerID="b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.750855 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347"} err="failed to get container status \"b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347\": rpc error: code = NotFound desc = could not find container \"b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347\": container with ID starting with b29ed67ed2f7e8ea764c91efc45873398c054963063a9295fd85f0cec27c9347 not found: ID does not exist" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.750874 4735 scope.go:117] "RemoveContainer" containerID="26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99" Dec 09 15:12:35 crc kubenswrapper[4735]: E1209 15:12:35.751114 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99\": container with ID starting with 26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99 not found: ID does not exist" containerID="26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.751133 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99"} err="failed to get container status \"26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99\": rpc error: code = NotFound desc = could not find container \"26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99\": container with ID starting with 26e8dfa64a930ea07b8320cc00d77f86ce00be09541aa5162d65019c4ae4fb99 not found: ID does not exist" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.751145 4735 scope.go:117] "RemoveContainer" 
containerID="7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7" Dec 09 15:12:35 crc kubenswrapper[4735]: E1209 15:12:35.751430 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7\": container with ID starting with 7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7 not found: ID does not exist" containerID="7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7" Dec 09 15:12:35 crc kubenswrapper[4735]: I1209 15:12:35.751507 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7"} err="failed to get container status \"7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7\": rpc error: code = NotFound desc = could not find container \"7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7\": container with ID starting with 7dfc455ed4e29c30e571a366c394a0b148e311770c1d0b9f41f7050262501ac7 not found: ID does not exist" Dec 09 15:12:37 crc kubenswrapper[4735]: I1209 15:12:37.422201 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" path="/var/lib/kubelet/pods/599f38bd-441c-4e5a-b74d-516ffaecbc44/volumes" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.141556 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bc572"] Dec 09 15:12:40 crc kubenswrapper[4735]: E1209 15:12:40.142010 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="registry-server" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.142021 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="registry-server" Dec 09 15:12:40 crc kubenswrapper[4735]: E1209 15:12:40.142031 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="extract-utilities" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.142036 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="extract-utilities" Dec 09 15:12:40 crc kubenswrapper[4735]: E1209 15:12:40.142046 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="extract-content" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.142052 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="extract-content" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.142171 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="599f38bd-441c-4e5a-b74d-516ffaecbc44" containerName="registry-server" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.142818 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.144582 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4qkm8" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.150721 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bc572"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.157764 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzsr8\" (UniqueName: \"kubernetes.io/projected/ea111a0c-0995-4059-bd26-5d8b93e9f40c-kube-api-access-gzsr8\") pod \"nmstate-metrics-7f946cbc9-bc572\" (UID: \"ea111a0c-0995-4059-bd26-5d8b93e9f40c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.165328 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gkqpg"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.166467 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.182598 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.183253 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.186442 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.208554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.242860 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.243711 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.245085 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.245497 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-thpvp" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.246156 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.251116 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.258705 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzsr8\" (UniqueName: \"kubernetes.io/projected/ea111a0c-0995-4059-bd26-5d8b93e9f40c-kube-api-access-gzsr8\") pod \"nmstate-metrics-7f946cbc9-bc572\" (UID: \"ea111a0c-0995-4059-bd26-5d8b93e9f40c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.258743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb588389-ddd7-4e1a-b514-16bb44dfb935-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-jrtqv\" (UID: \"eb588389-ddd7-4e1a-b514-16bb44dfb935\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.258806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-nmstate-lock\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.258923 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.258970 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j64x9\" (UniqueName: \"kubernetes.io/projected/eb588389-ddd7-4e1a-b514-16bb44dfb935-kube-api-access-j64x9\") pod \"nmstate-webhook-5f6d4c5ccb-jrtqv\" (UID: \"eb588389-ddd7-4e1a-b514-16bb44dfb935\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.259032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2mvt\" (UniqueName: \"kubernetes.io/projected/710b5a96-ea7b-4fef-9a9a-e96f8d055709-kube-api-access-k2mvt\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.259149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858gq\" (UniqueName: 
\"kubernetes.io/projected/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-kube-api-access-858gq\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.259244 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-dbus-socket\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.259271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-ovs-socket\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.259298 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.273442 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzsr8\" (UniqueName: \"kubernetes.io/projected/ea111a0c-0995-4059-bd26-5d8b93e9f40c-kube-api-access-gzsr8\") pod \"nmstate-metrics-7f946cbc9-bc572\" (UID: \"ea111a0c-0995-4059-bd26-5d8b93e9f40c\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359845 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j64x9\" (UniqueName: \"kubernetes.io/projected/eb588389-ddd7-4e1a-b514-16bb44dfb935-kube-api-access-j64x9\") pod \"nmstate-webhook-5f6d4c5ccb-jrtqv\" (UID: \"eb588389-ddd7-4e1a-b514-16bb44dfb935\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359879 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2mvt\" (UniqueName: \"kubernetes.io/projected/710b5a96-ea7b-4fef-9a9a-e96f8d055709-kube-api-access-k2mvt\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-858gq\" (UniqueName: \"kubernetes.io/projected/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-kube-api-access-858gq\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" 
Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359959 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-dbus-socket\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359975 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-ovs-socket\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.359991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.360016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb588389-ddd7-4e1a-b514-16bb44dfb935-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-jrtqv\" (UID: \"eb588389-ddd7-4e1a-b514-16bb44dfb935\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.360031 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-nmstate-lock\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.360082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-nmstate-lock\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.360111 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-ovs-socket\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.360484 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/710b5a96-ea7b-4fef-9a9a-e96f8d055709-dbus-socket\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.360573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.363941 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.364757 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/eb588389-ddd7-4e1a-b514-16bb44dfb935-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-jrtqv\" (UID: \"eb588389-ddd7-4e1a-b514-16bb44dfb935\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.376549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2mvt\" (UniqueName: \"kubernetes.io/projected/710b5a96-ea7b-4fef-9a9a-e96f8d055709-kube-api-access-k2mvt\") pod \"nmstate-handler-gkqpg\" (UID: \"710b5a96-ea7b-4fef-9a9a-e96f8d055709\") " pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.379477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858gq\" (UniqueName: \"kubernetes.io/projected/8b8e4e56-8472-4c79-8885-7e4b5cdb182f-kube-api-access-858gq\") pod \"nmstate-console-plugin-7fbb5f6569-bhbpx\" (UID: \"8b8e4e56-8472-4c79-8885-7e4b5cdb182f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.380480 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j64x9\" (UniqueName: \"kubernetes.io/projected/eb588389-ddd7-4e1a-b514-16bb44dfb935-kube-api-access-j64x9\") pod \"nmstate-webhook-5f6d4c5ccb-jrtqv\" (UID: \"eb588389-ddd7-4e1a-b514-16bb44dfb935\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.415760 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-cb789c5f6-k2kzk"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.416480 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.428985 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cb789c5f6-k2kzk"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-oauth-serving-cert\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-trusted-ca-bundle\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-config\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461271 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-oauth-config\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461366 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nb9t\" (UniqueName: \"kubernetes.io/projected/3c35073c-9d85-429c-8ef2-4eabbbc793b0-kube-api-access-5nb9t\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461391 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-serving-cert\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.461775 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-service-ca\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.463581 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.477818 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.499319 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562630 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562720 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-service-ca\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562752 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-oauth-serving-cert\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562778 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-trusted-ca-bundle\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562796 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-config\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562817 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-oauth-config\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562852 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-serving-cert\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.562868 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nb9t\" (UniqueName: \"kubernetes.io/projected/3c35073c-9d85-429c-8ef2-4eabbbc793b0-kube-api-access-5nb9t\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.563757 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-trusted-ca-bundle\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " 
pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.563758 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-oauth-serving-cert\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.563755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-config\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.565295 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c35073c-9d85-429c-8ef2-4eabbbc793b0-service-ca\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.567681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-serving-cert\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.567754 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c35073c-9d85-429c-8ef2-4eabbbc793b0-console-oauth-config\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.576440 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nb9t\" (UniqueName: \"kubernetes.io/projected/3c35073c-9d85-429c-8ef2-4eabbbc793b0-kube-api-access-5nb9t\") pod \"console-cb789c5f6-k2kzk\" (UID: \"3c35073c-9d85-429c-8ef2-4eabbbc793b0\") " pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.725105 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gkqpg" event={"ID":"710b5a96-ea7b-4fef-9a9a-e96f8d055709","Type":"ContainerStarted","Data":"099517d6c786c871755903d2c4033e43ae8071ba53e3d878a6394f1ad0d3d841"} Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.731854 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:40 crc kubenswrapper[4735]: W1209 15:12:40.829996 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea111a0c_0995_4059_bd26_5d8b93e9f40c.slice/crio-618e607cbb28b2be94bd8bace99982a5dcb60876aa9466495ad69902b498e790 WatchSource:0}: Error finding container 618e607cbb28b2be94bd8bace99982a5dcb60876aa9466495ad69902b498e790: Status 404 returned error can't find the container with id 618e607cbb28b2be94bd8bace99982a5dcb60876aa9466495ad69902b498e790 Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.831243 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-bc572"] Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.931320 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv"] Dec 09 15:12:40 crc kubenswrapper[4735]: W1209 15:12:40.937911 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb588389_ddd7_4e1a_b514_16bb44dfb935.slice/crio-e397db0da78026b7e6ddaa2d10cd21e643f480605a8ffe08ec9a0ce41aadaf2c WatchSource:0}: Error finding container e397db0da78026b7e6ddaa2d10cd21e643f480605a8ffe08ec9a0ce41aadaf2c: Status 404 returned error can't find the container with id e397db0da78026b7e6ddaa2d10cd21e643f480605a8ffe08ec9a0ce41aadaf2c Dec 09 15:12:40 crc kubenswrapper[4735]: I1209 15:12:40.963070 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx"] Dec 09 15:12:40 crc kubenswrapper[4735]: W1209 15:12:40.968613 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8e4e56_8472_4c79_8885_7e4b5cdb182f.slice/crio-e3c1efe605ba3a5e1b31aa8bbd0ae62e22e429d7e9384d3b30265b51a4981948 WatchSource:0}: Error finding container e3c1efe605ba3a5e1b31aa8bbd0ae62e22e429d7e9384d3b30265b51a4981948: Status 404 returned error can't find the container with id e3c1efe605ba3a5e1b31aa8bbd0ae62e22e429d7e9384d3b30265b51a4981948 Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.078492 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cb789c5f6-k2kzk"] Dec 09 15:12:41 crc kubenswrapper[4735]: W1209 15:12:41.081377 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c35073c_9d85_429c_8ef2_4eabbbc793b0.slice/crio-1c89eb389e1139805b81f1612eab7aaf2bef7412a8cf42ac5c6e239614d5f75e WatchSource:0}: Error finding container 1c89eb389e1139805b81f1612eab7aaf2bef7412a8cf42ac5c6e239614d5f75e: Status 404 returned error can't find the container with id 1c89eb389e1139805b81f1612eab7aaf2bef7412a8cf42ac5c6e239614d5f75e Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.732336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" event={"ID":"8b8e4e56-8472-4c79-8885-7e4b5cdb182f","Type":"ContainerStarted","Data":"e3c1efe605ba3a5e1b31aa8bbd0ae62e22e429d7e9384d3b30265b51a4981948"} Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.733293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" 
event={"ID":"ea111a0c-0995-4059-bd26-5d8b93e9f40c","Type":"ContainerStarted","Data":"618e607cbb28b2be94bd8bace99982a5dcb60876aa9466495ad69902b498e790"} Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.734626 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" event={"ID":"eb588389-ddd7-4e1a-b514-16bb44dfb935","Type":"ContainerStarted","Data":"e397db0da78026b7e6ddaa2d10cd21e643f480605a8ffe08ec9a0ce41aadaf2c"} Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.736841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cb789c5f6-k2kzk" event={"ID":"3c35073c-9d85-429c-8ef2-4eabbbc793b0","Type":"ContainerStarted","Data":"d7cd3d696e9899c62279a2be136383c319932a05139947ccf17eddfc23a8f9ef"} Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.736866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cb789c5f6-k2kzk" event={"ID":"3c35073c-9d85-429c-8ef2-4eabbbc793b0","Type":"ContainerStarted","Data":"1c89eb389e1139805b81f1612eab7aaf2bef7412a8cf42ac5c6e239614d5f75e"} Dec 09 15:12:41 crc kubenswrapper[4735]: I1209 15:12:41.756013 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cb789c5f6-k2kzk" podStartSLOduration=1.755999708 podStartE2EDuration="1.755999708s" podCreationTimestamp="2025-12-09 15:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:12:41.749708249 +0000 UTC m=+840.674546877" watchObservedRunningTime="2025-12-09 15:12:41.755999708 +0000 UTC m=+840.680838336" Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.748920 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" event={"ID":"ea111a0c-0995-4059-bd26-5d8b93e9f40c","Type":"ContainerStarted","Data":"fbff4fd3422833c45bad748eb8ce31ab7f07cd21e1b2654e088dbcf87027cf15"} Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.749943 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" event={"ID":"eb588389-ddd7-4e1a-b514-16bb44dfb935","Type":"ContainerStarted","Data":"84545f74b631cbf7a16968925c6cb49c8690362d543ee697356c124d89ac7a5b"} Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.750125 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.751589 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gkqpg" event={"ID":"710b5a96-ea7b-4fef-9a9a-e96f8d055709","Type":"ContainerStarted","Data":"599d745df16161ef89d7d6360f75ed8289635441cfc78163ab221efb9f41ab3b"} Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.751730 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.753673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" event={"ID":"8b8e4e56-8472-4c79-8885-7e4b5cdb182f","Type":"ContainerStarted","Data":"f7dfdca2f32d35d1420e131709f0b25febb642b3bce72492a788bccf61d2b7c9"} Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.771789 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" 
podStartSLOduration=1.258247207 podStartE2EDuration="3.771778615s" podCreationTimestamp="2025-12-09 15:12:40 +0000 UTC" firstStartedPulling="2025-12-09 15:12:40.939565593 +0000 UTC m=+839.864404221" lastFinishedPulling="2025-12-09 15:12:43.453097002 +0000 UTC m=+842.377935629" observedRunningTime="2025-12-09 15:12:43.763246994 +0000 UTC m=+842.688085622" watchObservedRunningTime="2025-12-09 15:12:43.771778615 +0000 UTC m=+842.696617243" Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.790591 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gkqpg" podStartSLOduration=0.852860351 podStartE2EDuration="3.790576289s" podCreationTimestamp="2025-12-09 15:12:40 +0000 UTC" firstStartedPulling="2025-12-09 15:12:40.495265843 +0000 UTC m=+839.420104471" lastFinishedPulling="2025-12-09 15:12:43.432981782 +0000 UTC m=+842.357820409" observedRunningTime="2025-12-09 15:12:43.787490876 +0000 UTC m=+842.712329505" watchObservedRunningTime="2025-12-09 15:12:43.790576289 +0000 UTC m=+842.715414916" Dec 09 15:12:43 crc kubenswrapper[4735]: I1209 15:12:43.803470 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bhbpx" podStartSLOduration=1.340876126 podStartE2EDuration="3.803455624s" podCreationTimestamp="2025-12-09 15:12:40 +0000 UTC" firstStartedPulling="2025-12-09 15:12:40.970336478 +0000 UTC m=+839.895175107" lastFinishedPulling="2025-12-09 15:12:43.432915977 +0000 UTC m=+842.357754605" observedRunningTime="2025-12-09 15:12:43.800767069 +0000 UTC m=+842.725605697" watchObservedRunningTime="2025-12-09 15:12:43.803455624 +0000 UTC m=+842.728294251" Dec 09 15:12:45 crc kubenswrapper[4735]: I1209 15:12:45.766108 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" event={"ID":"ea111a0c-0995-4059-bd26-5d8b93e9f40c","Type":"ContainerStarted","Data":"3a854491fe52d8e503344a4385364394d4d0f364ba7a793e3a38bd12aa5c4f3f"} Dec 09 15:12:45 crc kubenswrapper[4735]: I1209 15:12:45.779180 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-bc572" podStartSLOduration=1.291461518 podStartE2EDuration="5.779165314s" podCreationTimestamp="2025-12-09 15:12:40 +0000 UTC" firstStartedPulling="2025-12-09 15:12:40.831931971 +0000 UTC m=+839.756770599" lastFinishedPulling="2025-12-09 15:12:45.319635767 +0000 UTC m=+844.244474395" observedRunningTime="2025-12-09 15:12:45.77640073 +0000 UTC m=+844.701239358" watchObservedRunningTime="2025-12-09 15:12:45.779165314 +0000 UTC m=+844.704003942" Dec 09 15:12:50 crc kubenswrapper[4735]: I1209 15:12:50.495321 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gkqpg" Dec 09 15:12:50 crc kubenswrapper[4735]: I1209 15:12:50.732231 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:50 crc kubenswrapper[4735]: I1209 15:12:50.732551 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:50 crc kubenswrapper[4735]: I1209 15:12:50.738638 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:50 crc kubenswrapper[4735]: I1209 15:12:50.794056 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-cb789c5f6-k2kzk" Dec 09 15:12:50 crc kubenswrapper[4735]: I1209 15:12:50.834202 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-c4mlr"] Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.367432 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9dxks"] Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.369137 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.379000 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9dxks"] Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.460855 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-catalog-content\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.461083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzf6\" (UniqueName: \"kubernetes.io/projected/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-kube-api-access-7gzf6\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.461223 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-utilities\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.562354 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-utilities\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.562425 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-catalog-content\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.562601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzf6\" (UniqueName: \"kubernetes.io/projected/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-kube-api-access-7gzf6\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.562815 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-utilities\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " 
pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.562861 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-catalog-content\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.596278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzf6\" (UniqueName: \"kubernetes.io/projected/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-kube-api-access-7gzf6\") pod \"certified-operators-9dxks\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:54 crc kubenswrapper[4735]: I1209 15:12:54.687982 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:12:55 crc kubenswrapper[4735]: I1209 15:12:55.106066 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9dxks"] Dec 09 15:12:55 crc kubenswrapper[4735]: W1209 15:12:55.113531 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8340fac0_7a6b_4c62_8c56_a7d3764dfc65.slice/crio-b66d4059d77d27ff3a2a1691ab34283e56ff803dcdf508c7001cf2756b831a0e WatchSource:0}: Error finding container b66d4059d77d27ff3a2a1691ab34283e56ff803dcdf508c7001cf2756b831a0e: Status 404 returned error can't find the container with id b66d4059d77d27ff3a2a1691ab34283e56ff803dcdf508c7001cf2756b831a0e Dec 09 15:12:55 crc kubenswrapper[4735]: I1209 15:12:55.829973 4735 generic.go:334] "Generic (PLEG): container finished" podID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerID="84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e" exitCode=0 Dec 09 15:12:55 crc kubenswrapper[4735]: I1209 15:12:55.830027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9dxks" event={"ID":"8340fac0-7a6b-4c62-8c56-a7d3764dfc65","Type":"ContainerDied","Data":"84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e"} Dec 09 15:12:55 crc kubenswrapper[4735]: I1209 15:12:55.830226 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9dxks" event={"ID":"8340fac0-7a6b-4c62-8c56-a7d3764dfc65","Type":"ContainerStarted","Data":"b66d4059d77d27ff3a2a1691ab34283e56ff803dcdf508c7001cf2756b831a0e"} Dec 09 15:12:56 crc kubenswrapper[4735]: I1209 15:12:56.838385 4735 generic.go:334] "Generic (PLEG): container finished" podID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerID="bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48" exitCode=0 Dec 09 15:12:56 crc kubenswrapper[4735]: I1209 15:12:56.838448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9dxks" event={"ID":"8340fac0-7a6b-4c62-8c56-a7d3764dfc65","Type":"ContainerDied","Data":"bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48"} Dec 09 15:12:57 crc kubenswrapper[4735]: I1209 15:12:57.846251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9dxks" event={"ID":"8340fac0-7a6b-4c62-8c56-a7d3764dfc65","Type":"ContainerStarted","Data":"1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c"} Dec 
09 15:12:57 crc kubenswrapper[4735]: I1209 15:12:57.860932 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9dxks" podStartSLOduration=2.374890404 podStartE2EDuration="3.86091969s" podCreationTimestamp="2025-12-09 15:12:54 +0000 UTC" firstStartedPulling="2025-12-09 15:12:55.831975186 +0000 UTC m=+854.756813814" lastFinishedPulling="2025-12-09 15:12:57.318004472 +0000 UTC m=+856.242843100" observedRunningTime="2025-12-09 15:12:57.859082913 +0000 UTC m=+856.783921541" watchObservedRunningTime="2025-12-09 15:12:57.86091969 +0000 UTC m=+856.785758318" Dec 09 15:13:00 crc kubenswrapper[4735]: I1209 15:13:00.505588 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-jrtqv" Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.335572 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.335970 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.689066 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.689119 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.718139 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.909136 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:13:04 crc kubenswrapper[4735]: I1209 15:13:04.940556 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9dxks"] Dec 09 15:13:06 crc kubenswrapper[4735]: I1209 15:13:06.890077 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9dxks" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="registry-server" containerID="cri-o://1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c" gracePeriod=2 Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.227296 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.332422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-catalog-content\") pod \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.332454 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-utilities\") pod \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.332582 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gzf6\" (UniqueName: \"kubernetes.io/projected/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-kube-api-access-7gzf6\") pod \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\" (UID: \"8340fac0-7a6b-4c62-8c56-a7d3764dfc65\") " Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.333874 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-utilities" (OuterVolumeSpecName: "utilities") pod "8340fac0-7a6b-4c62-8c56-a7d3764dfc65" (UID: "8340fac0-7a6b-4c62-8c56-a7d3764dfc65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.338115 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-kube-api-access-7gzf6" (OuterVolumeSpecName: "kube-api-access-7gzf6") pod "8340fac0-7a6b-4c62-8c56-a7d3764dfc65" (UID: "8340fac0-7a6b-4c62-8c56-a7d3764dfc65"). InnerVolumeSpecName "kube-api-access-7gzf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.370593 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8340fac0-7a6b-4c62-8c56-a7d3764dfc65" (UID: "8340fac0-7a6b-4c62-8c56-a7d3764dfc65"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.435000 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.435264 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.435279 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gzf6\" (UniqueName: \"kubernetes.io/projected/8340fac0-7a6b-4c62-8c56-a7d3764dfc65-kube-api-access-7gzf6\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.897588 4735 generic.go:334] "Generic (PLEG): container finished" podID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerID="1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c" exitCode=0 Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.897636 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9dxks" event={"ID":"8340fac0-7a6b-4c62-8c56-a7d3764dfc65","Type":"ContainerDied","Data":"1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c"} Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.897665 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9dxks" event={"ID":"8340fac0-7a6b-4c62-8c56-a7d3764dfc65","Type":"ContainerDied","Data":"b66d4059d77d27ff3a2a1691ab34283e56ff803dcdf508c7001cf2756b831a0e"} Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.897682 4735 scope.go:117] "RemoveContainer" containerID="1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.897857 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9dxks" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.924324 4735 scope.go:117] "RemoveContainer" containerID="bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.924404 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9dxks"] Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.926008 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9dxks"] Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.942333 4735 scope.go:117] "RemoveContainer" containerID="84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.959230 4735 scope.go:117] "RemoveContainer" containerID="1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c" Dec 09 15:13:07 crc kubenswrapper[4735]: E1209 15:13:07.959625 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c\": container with ID starting with 1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c not found: ID does not exist" containerID="1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.959661 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c"} err="failed to get container status \"1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c\": rpc error: code = NotFound desc = could not find container \"1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c\": container with ID starting with 1c02e1a49e7e0f3404824ec18dae6353afa962dc3d9d4ae345548d9aaeb16a0c not found: ID does not exist" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.959686 4735 scope.go:117] "RemoveContainer" containerID="bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48" Dec 09 15:13:07 crc kubenswrapper[4735]: E1209 15:13:07.960129 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48\": container with ID starting with bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48 not found: ID does not exist" containerID="bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.960176 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48"} err="failed to get container status \"bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48\": rpc error: code = NotFound desc = could not find container \"bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48\": container with ID starting with bab421cc849ee619a9656698d7feb41a09600a304561c96c2dad339d8e58fd48 not found: ID does not exist" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.960195 4735 scope.go:117] "RemoveContainer" containerID="84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e" Dec 09 15:13:07 crc kubenswrapper[4735]: E1209 15:13:07.960458 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e\": container with ID starting with 84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e not found: ID does not exist" containerID="84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e" Dec 09 15:13:07 crc kubenswrapper[4735]: I1209 15:13:07.960482 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e"} err="failed to get container status \"84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e\": rpc error: code = NotFound desc = could not find container \"84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e\": container with ID starting with 84ddecb81ea8ebcc2348f27822d383dfbb4f15270a41f27ffd74a4f032156b1e not found: ID does not exist" Dec 09 15:13:09 crc kubenswrapper[4735]: I1209 15:13:09.420663 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" path="/var/lib/kubelet/pods/8340fac0-7a6b-4c62-8c56-a7d3764dfc65/volumes" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.868724 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m"] Dec 09 15:13:11 crc kubenswrapper[4735]: E1209 15:13:11.869651 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="extract-utilities" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.869666 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="extract-utilities" Dec 09 15:13:11 crc kubenswrapper[4735]: E1209 15:13:11.869692 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="extract-content" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.869699 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="extract-content" Dec 09 15:13:11 crc kubenswrapper[4735]: E1209 15:13:11.869712 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="registry-server" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.869719 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="registry-server" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.869877 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8340fac0-7a6b-4c62-8c56-a7d3764dfc65" containerName="registry-server" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.871020 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.873957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.879584 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m"] Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.905277 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.905387 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9fzz\" (UniqueName: \"kubernetes.io/projected/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-kube-api-access-p9fzz\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:11 crc kubenswrapper[4735]: I1209 15:13:11.905420 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.006550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9fzz\" (UniqueName: \"kubernetes.io/projected/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-kube-api-access-p9fzz\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.006606 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.006744 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.007192 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.007341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.023201 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9fzz\" (UniqueName: \"kubernetes.io/projected/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-kube-api-access-p9fzz\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.188040 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.447369 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m"] Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.931482 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerID="62e71e3bd234a83323b0e2501158756496ca169b07489addcbb94f8729703454" exitCode=0 Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.931583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" event={"ID":"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac","Type":"ContainerDied","Data":"62e71e3bd234a83323b0e2501158756496ca169b07489addcbb94f8729703454"} Dec 09 15:13:12 crc kubenswrapper[4735]: I1209 15:13:12.931863 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" event={"ID":"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac","Type":"ContainerStarted","Data":"5ddc37405640fc8ad2a60287613e85808fb67ef56bf0db664853cd9d1dc90aad"} Dec 09 15:13:14 crc kubenswrapper[4735]: I1209 15:13:14.945095 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerID="2a71c1a1b37b30e49da474d011e4af199fa391fda5860c5865fb8eaa169544c8" exitCode=0 Dec 09 15:13:14 crc kubenswrapper[4735]: I1209 15:13:14.945181 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" event={"ID":"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac","Type":"ContainerDied","Data":"2a71c1a1b37b30e49da474d011e4af199fa391fda5860c5865fb8eaa169544c8"} Dec 09 15:13:15 crc kubenswrapper[4735]: I1209 15:13:15.866188 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-c4mlr" podUID="bfe12755-b370-474e-b856-82522f9b38d0" containerName="console" 
containerID="cri-o://f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368" gracePeriod=15 Dec 09 15:13:15 crc kubenswrapper[4735]: I1209 15:13:15.952250 4735 generic.go:334] "Generic (PLEG): container finished" podID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerID="cff0153af122b95b469009945140477a75bf8539e6997489d60df465bb5c8901" exitCode=0 Dec 09 15:13:15 crc kubenswrapper[4735]: I1209 15:13:15.952290 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" event={"ID":"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac","Type":"ContainerDied","Data":"cff0153af122b95b469009945140477a75bf8539e6997489d60df465bb5c8901"} Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.192942 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c4mlr_bfe12755-b370-474e-b856-82522f9b38d0/console/0.log" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.193022 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280470 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-serving-cert\") pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280527 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-oauth-config\") pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280569 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-oauth-serving-cert\") pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280637 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-console-config\") pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280690 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vbxs\" (UniqueName: \"kubernetes.io/projected/bfe12755-b370-474e-b856-82522f9b38d0-kube-api-access-6vbxs\") pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280709 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-service-ca\") pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.280798 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-trusted-ca-bundle\") 
pod \"bfe12755-b370-474e-b856-82522f9b38d0\" (UID: \"bfe12755-b370-474e-b856-82522f9b38d0\") " Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.281386 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-console-config" (OuterVolumeSpecName: "console-config") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.281400 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.281559 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.281677 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.286294 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.286773 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe12755-b370-474e-b856-82522f9b38d0-kube-api-access-6vbxs" (OuterVolumeSpecName: "kube-api-access-6vbxs") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "kube-api-access-6vbxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.286810 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bfe12755-b370-474e-b856-82522f9b38d0" (UID: "bfe12755-b370-474e-b856-82522f9b38d0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382608 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382642 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382654 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe12755-b370-474e-b856-82522f9b38d0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382664 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382672 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-console-config\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382681 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vbxs\" (UniqueName: \"kubernetes.io/projected/bfe12755-b370-474e-b856-82522f9b38d0-kube-api-access-6vbxs\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.382691 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe12755-b370-474e-b856-82522f9b38d0-service-ca\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.957877 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-c4mlr_bfe12755-b370-474e-b856-82522f9b38d0/console/0.log" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.957925 4735 generic.go:334] "Generic (PLEG): container finished" podID="bfe12755-b370-474e-b856-82522f9b38d0" containerID="f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368" exitCode=2 Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.957973 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c4mlr" event={"ID":"bfe12755-b370-474e-b856-82522f9b38d0","Type":"ContainerDied","Data":"f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368"} Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.957992 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-c4mlr" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.958019 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-c4mlr" event={"ID":"bfe12755-b370-474e-b856-82522f9b38d0","Type":"ContainerDied","Data":"9818553b5a097eac8b42f22ae22b0f4b0a6be16a4abec3dcf7163e3668061a99"} Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.958050 4735 scope.go:117] "RemoveContainer" containerID="f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.984297 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-c4mlr"] Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.986175 4735 scope.go:117] "RemoveContainer" containerID="f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368" Dec 09 15:13:16 crc kubenswrapper[4735]: E1209 15:13:16.987558 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368\": container with ID starting with f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368 not found: ID does not exist" containerID="f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.987609 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368"} err="failed to get container status \"f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368\": rpc error: code = NotFound desc = could not find container \"f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368\": container with ID starting with f319bd23938c8a186aa5afda0ce870edccab9ef8c088edcb79a3fb54308cc368 not found: ID does not exist" Dec 09 15:13:16 crc kubenswrapper[4735]: I1209 15:13:16.988282 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-c4mlr"] Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.191639 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.294702 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-util\") pod \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.295058 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-bundle\") pod \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.295460 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9fzz\" (UniqueName: \"kubernetes.io/projected/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-kube-api-access-p9fzz\") pod \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\" (UID: \"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac\") " Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.295857 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-bundle" (OuterVolumeSpecName: "bundle") pod "bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" (UID: "bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.296149 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-bundle\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.299208 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-kube-api-access-p9fzz" (OuterVolumeSpecName: "kube-api-access-p9fzz") pod "bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" (UID: "bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac"). InnerVolumeSpecName "kube-api-access-p9fzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.305115 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-util" (OuterVolumeSpecName: "util") pod "bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" (UID: "bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.397490 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-util\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.397539 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9fzz\" (UniqueName: \"kubernetes.io/projected/bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac-kube-api-access-p9fzz\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.422657 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe12755-b370-474e-b856-82522f9b38d0" path="/var/lib/kubelet/pods/bfe12755-b370-474e-b856-82522f9b38d0/volumes" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.966678 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" event={"ID":"bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac","Type":"ContainerDied","Data":"5ddc37405640fc8ad2a60287613e85808fb67ef56bf0db664853cd9d1dc90aad"} Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.966719 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ddc37405640fc8ad2a60287613e85808fb67ef56bf0db664853cd9d1dc90aad" Dec 09 15:13:17 crc kubenswrapper[4735]: I1209 15:13:17.966693 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.434664 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dfh9l"] Dec 09 15:13:23 crc kubenswrapper[4735]: E1209 15:13:23.435149 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="util" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.435163 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="util" Dec 09 15:13:23 crc kubenswrapper[4735]: E1209 15:13:23.435176 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="pull" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.435181 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="pull" Dec 09 15:13:23 crc kubenswrapper[4735]: E1209 15:13:23.435198 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="extract" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.435204 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="extract" Dec 09 15:13:23 crc kubenswrapper[4735]: E1209 15:13:23.435215 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe12755-b370-474e-b856-82522f9b38d0" containerName="console" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.435220 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe12755-b370-474e-b856-82522f9b38d0" containerName="console" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.435325 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe12755-b370-474e-b856-82522f9b38d0" containerName="console" Dec 09 15:13:23 crc 
kubenswrapper[4735]: I1209 15:13:23.435342 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac" containerName="extract" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.436163 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.443388 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfh9l"] Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.491289 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-utilities\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.491354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2bkm\" (UniqueName: \"kubernetes.io/projected/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-kube-api-access-l2bkm\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.491402 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-catalog-content\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.593058 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-utilities\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.593120 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2bkm\" (UniqueName: \"kubernetes.io/projected/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-kube-api-access-l2bkm\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.593171 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-catalog-content\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.593755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-utilities\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.593794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-catalog-content\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.611994 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2bkm\" (UniqueName: \"kubernetes.io/projected/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-kube-api-access-l2bkm\") pod \"community-operators-dfh9l\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:23 crc kubenswrapper[4735]: I1209 15:13:23.750006 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.154123 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dfh9l"] Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.234097 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vpflf"] Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.235716 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.245803 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpflf"] Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.310390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-catalog-content\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.310468 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7nl\" (UniqueName: \"kubernetes.io/projected/71e2df4b-2240-4a2a-877d-ee3f317a2e70-kube-api-access-8j7nl\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.310565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-utilities\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.411886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j7nl\" (UniqueName: \"kubernetes.io/projected/71e2df4b-2240-4a2a-877d-ee3f317a2e70-kube-api-access-8j7nl\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.411985 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-utilities\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" 
Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.412025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-catalog-content\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.412446 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-utilities\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.412472 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-catalog-content\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.430641 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j7nl\" (UniqueName: \"kubernetes.io/projected/71e2df4b-2240-4a2a-877d-ee3f317a2e70-kube-api-access-8j7nl\") pod \"redhat-marketplace-vpflf\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.548845 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:24 crc kubenswrapper[4735]: I1209 15:13:24.929081 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpflf"] Dec 09 15:13:24 crc kubenswrapper[4735]: W1209 15:13:24.937697 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71e2df4b_2240_4a2a_877d_ee3f317a2e70.slice/crio-fe5f50901cb86f9d48ffd8dfb591f7151289ed44acdc1dacaadbdbd7294be5dc WatchSource:0}: Error finding container fe5f50901cb86f9d48ffd8dfb591f7151289ed44acdc1dacaadbdbd7294be5dc: Status 404 returned error can't find the container with id fe5f50901cb86f9d48ffd8dfb591f7151289ed44acdc1dacaadbdbd7294be5dc Dec 09 15:13:25 crc kubenswrapper[4735]: I1209 15:13:25.015464 4735 generic.go:334] "Generic (PLEG): container finished" podID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerID="d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212" exitCode=0 Dec 09 15:13:25 crc kubenswrapper[4735]: I1209 15:13:25.015800 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfh9l" event={"ID":"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9","Type":"ContainerDied","Data":"d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212"} Dec 09 15:13:25 crc kubenswrapper[4735]: I1209 15:13:25.015856 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfh9l" event={"ID":"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9","Type":"ContainerStarted","Data":"5d824173d6eeb32e77889c0350144998de540fb00a3e136d17681b7b7742dec6"} Dec 09 15:13:25 crc kubenswrapper[4735]: I1209 15:13:25.017919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpflf" 
event={"ID":"71e2df4b-2240-4a2a-877d-ee3f317a2e70","Type":"ContainerStarted","Data":"fe5f50901cb86f9d48ffd8dfb591f7151289ed44acdc1dacaadbdbd7294be5dc"} Dec 09 15:13:25 crc kubenswrapper[4735]: I1209 15:13:25.017988 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:13:26 crc kubenswrapper[4735]: I1209 15:13:26.025063 4735 generic.go:334] "Generic (PLEG): container finished" podID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerID="510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f" exitCode=0 Dec 09 15:13:26 crc kubenswrapper[4735]: I1209 15:13:26.025107 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfh9l" event={"ID":"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9","Type":"ContainerDied","Data":"510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f"} Dec 09 15:13:26 crc kubenswrapper[4735]: I1209 15:13:26.026782 4735 generic.go:334] "Generic (PLEG): container finished" podID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerID="0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04" exitCode=0 Dec 09 15:13:26 crc kubenswrapper[4735]: I1209 15:13:26.026840 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpflf" event={"ID":"71e2df4b-2240-4a2a-877d-ee3f317a2e70","Type":"ContainerDied","Data":"0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04"} Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.034278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfh9l" event={"ID":"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9","Type":"ContainerStarted","Data":"1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700"} Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.035984 4735 generic.go:334] "Generic (PLEG): container finished" podID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerID="a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16" exitCode=0 Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.036051 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpflf" event={"ID":"71e2df4b-2240-4a2a-877d-ee3f317a2e70","Type":"ContainerDied","Data":"a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16"} Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.048304 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dfh9l" podStartSLOduration=2.579666726 podStartE2EDuration="4.04829089s" podCreationTimestamp="2025-12-09 15:13:23 +0000 UTC" firstStartedPulling="2025-12-09 15:13:25.017683234 +0000 UTC m=+883.942521862" lastFinishedPulling="2025-12-09 15:13:26.486307398 +0000 UTC m=+885.411146026" observedRunningTime="2025-12-09 15:13:27.047431842 +0000 UTC m=+885.972270471" watchObservedRunningTime="2025-12-09 15:13:27.04829089 +0000 UTC m=+885.973129518" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.322333 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp"] Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.323117 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.324373 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.326055 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hjszz" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.326080 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.326081 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.326159 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.338999 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp"] Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.357718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e80d858-b2b3-4009-b421-f9b227ee3873-apiservice-cert\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.357773 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7jzs\" (UniqueName: \"kubernetes.io/projected/4e80d858-b2b3-4009-b421-f9b227ee3873-kube-api-access-l7jzs\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.357824 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e80d858-b2b3-4009-b421-f9b227ee3873-webhook-cert\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.462255 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7jzs\" (UniqueName: \"kubernetes.io/projected/4e80d858-b2b3-4009-b421-f9b227ee3873-kube-api-access-l7jzs\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.462616 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e80d858-b2b3-4009-b421-f9b227ee3873-webhook-cert\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.462888 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e80d858-b2b3-4009-b421-f9b227ee3873-apiservice-cert\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.472262 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4e80d858-b2b3-4009-b421-f9b227ee3873-webhook-cert\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.479066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4e80d858-b2b3-4009-b421-f9b227ee3873-apiservice-cert\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.484072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7jzs\" (UniqueName: \"kubernetes.io/projected/4e80d858-b2b3-4009-b421-f9b227ee3873-kube-api-access-l7jzs\") pod \"metallb-operator-controller-manager-577d56cdf4-lrqmp\" (UID: \"4e80d858-b2b3-4009-b421-f9b227ee3873\") " pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.636932 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.677809 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx"] Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.679221 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.682466 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.682560 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.682766 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7nlqz" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.694409 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx"] Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.768088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca532396-6f51-428d-b4ff-ac3bf1920207-apiservice-cert\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.768161 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbl6\" (UniqueName: \"kubernetes.io/projected/ca532396-6f51-428d-b4ff-ac3bf1920207-kube-api-access-xrbl6\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.768200 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca532396-6f51-428d-b4ff-ac3bf1920207-webhook-cert\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.870461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca532396-6f51-428d-b4ff-ac3bf1920207-apiservice-cert\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.870554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbl6\" (UniqueName: \"kubernetes.io/projected/ca532396-6f51-428d-b4ff-ac3bf1920207-kube-api-access-xrbl6\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.870587 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca532396-6f51-428d-b4ff-ac3bf1920207-webhook-cert\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 
15:13:27.876995 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca532396-6f51-428d-b4ff-ac3bf1920207-webhook-cert\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.877020 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca532396-6f51-428d-b4ff-ac3bf1920207-apiservice-cert\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:27 crc kubenswrapper[4735]: I1209 15:13:27.887948 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbl6\" (UniqueName: \"kubernetes.io/projected/ca532396-6f51-428d-b4ff-ac3bf1920207-kube-api-access-xrbl6\") pod \"metallb-operator-webhook-server-86bc98687f-dcjrx\" (UID: \"ca532396-6f51-428d-b4ff-ac3bf1920207\") " pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:28 crc kubenswrapper[4735]: I1209 15:13:28.014880 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:28 crc kubenswrapper[4735]: I1209 15:13:28.045023 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpflf" event={"ID":"71e2df4b-2240-4a2a-877d-ee3f317a2e70","Type":"ContainerStarted","Data":"a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d"} Dec 09 15:13:28 crc kubenswrapper[4735]: I1209 15:13:28.063000 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vpflf" podStartSLOduration=2.5167204869999997 podStartE2EDuration="4.06298573s" podCreationTimestamp="2025-12-09 15:13:24 +0000 UTC" firstStartedPulling="2025-12-09 15:13:26.028301602 +0000 UTC m=+884.953140231" lastFinishedPulling="2025-12-09 15:13:27.574566845 +0000 UTC m=+886.499405474" observedRunningTime="2025-12-09 15:13:28.059453852 +0000 UTC m=+886.984292480" watchObservedRunningTime="2025-12-09 15:13:28.06298573 +0000 UTC m=+886.987824358" Dec 09 15:13:28 crc kubenswrapper[4735]: I1209 15:13:28.106439 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp"] Dec 09 15:13:28 crc kubenswrapper[4735]: W1209 15:13:28.124569 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e80d858_b2b3_4009_b421_f9b227ee3873.slice/crio-6c9e21404a177d6fbd09107a1af433034883ab65c63a1cde98ea5a173c7672e1 WatchSource:0}: Error finding container 6c9e21404a177d6fbd09107a1af433034883ab65c63a1cde98ea5a173c7672e1: Status 404 returned error can't find the container with id 6c9e21404a177d6fbd09107a1af433034883ab65c63a1cde98ea5a173c7672e1 Dec 09 15:13:28 crc kubenswrapper[4735]: I1209 15:13:28.426818 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx"] Dec 09 15:13:28 crc kubenswrapper[4735]: W1209 15:13:28.431581 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca532396_6f51_428d_b4ff_ac3bf1920207.slice/crio-730c1f6eb808c5bebd39d978fb92b1d56715f3fb735729e4554a19d78a5c8345 WatchSource:0}: Error finding container 730c1f6eb808c5bebd39d978fb92b1d56715f3fb735729e4554a19d78a5c8345: Status 404 returned error can't find the container with id 730c1f6eb808c5bebd39d978fb92b1d56715f3fb735729e4554a19d78a5c8345 Dec 09 15:13:29 crc kubenswrapper[4735]: I1209 15:13:29.051288 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" event={"ID":"ca532396-6f51-428d-b4ff-ac3bf1920207","Type":"ContainerStarted","Data":"730c1f6eb808c5bebd39d978fb92b1d56715f3fb735729e4554a19d78a5c8345"} Dec 09 15:13:29 crc kubenswrapper[4735]: I1209 15:13:29.052160 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" event={"ID":"4e80d858-b2b3-4009-b421-f9b227ee3873","Type":"ContainerStarted","Data":"6c9e21404a177d6fbd09107a1af433034883ab65c63a1cde98ea5a173c7672e1"} Dec 09 15:13:31 crc kubenswrapper[4735]: I1209 15:13:31.065635 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" event={"ID":"4e80d858-b2b3-4009-b421-f9b227ee3873","Type":"ContainerStarted","Data":"822ae0eda3172f10ca06cd7db4ee34819a39f16bd5270a53b6abb03dbe917fde"} Dec 09 15:13:31 crc kubenswrapper[4735]: I1209 15:13:31.066303 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:13:31 crc kubenswrapper[4735]: I1209 15:13:31.088284 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" podStartSLOduration=1.700276581 podStartE2EDuration="4.088255877s" podCreationTimestamp="2025-12-09 15:13:27 +0000 UTC" firstStartedPulling="2025-12-09 15:13:28.129678522 +0000 UTC m=+887.054517150" lastFinishedPulling="2025-12-09 15:13:30.517657819 +0000 UTC m=+889.442496446" observedRunningTime="2025-12-09 15:13:31.085343294 +0000 UTC m=+890.010181922" watchObservedRunningTime="2025-12-09 15:13:31.088255877 +0000 UTC m=+890.013094505" Dec 09 15:13:33 crc kubenswrapper[4735]: I1209 15:13:33.750799 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:33 crc kubenswrapper[4735]: I1209 15:13:33.751173 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:33 crc kubenswrapper[4735]: I1209 15:13:33.783884 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:34 crc kubenswrapper[4735]: I1209 15:13:34.117764 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:34 crc kubenswrapper[4735]: I1209 15:13:34.335981 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:13:34 crc kubenswrapper[4735]: I1209 15:13:34.336041 4735 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:13:34 crc kubenswrapper[4735]: I1209 15:13:34.550792 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:34 crc kubenswrapper[4735]: I1209 15:13:34.550849 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:34 crc kubenswrapper[4735]: I1209 15:13:34.583856 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:35 crc kubenswrapper[4735]: I1209 15:13:35.023305 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfh9l"] Dec 09 15:13:35 crc kubenswrapper[4735]: I1209 15:13:35.090719 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" event={"ID":"ca532396-6f51-428d-b4ff-ac3bf1920207","Type":"ContainerStarted","Data":"8f1bac58253eaf984c619fa75c01c049f0908ee5f740a49e2f548761f23d95ba"} Dec 09 15:13:35 crc kubenswrapper[4735]: I1209 15:13:35.091697 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:13:35 crc kubenswrapper[4735]: I1209 15:13:35.111188 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" podStartSLOduration=1.8399036519999998 podStartE2EDuration="8.111162323s" podCreationTimestamp="2025-12-09 15:13:27 +0000 UTC" firstStartedPulling="2025-12-09 15:13:28.435226994 +0000 UTC m=+887.360065621" lastFinishedPulling="2025-12-09 15:13:34.706485665 +0000 UTC m=+893.631324292" observedRunningTime="2025-12-09 15:13:35.107220674 +0000 UTC m=+894.032059302" watchObservedRunningTime="2025-12-09 15:13:35.111162323 +0000 UTC m=+894.036000952" Dec 09 15:13:35 crc kubenswrapper[4735]: I1209 15:13:35.129577 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.096428 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dfh9l" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="registry-server" containerID="cri-o://1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700" gracePeriod=2 Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.458235 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.514163 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-catalog-content\") pod \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.514345 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-utilities\") pod \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.514487 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2bkm\" (UniqueName: \"kubernetes.io/projected/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-kube-api-access-l2bkm\") pod \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\" (UID: \"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9\") " Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.515353 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-utilities" (OuterVolumeSpecName: "utilities") pod "83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" (UID: "83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.523315 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-kube-api-access-l2bkm" (OuterVolumeSpecName: "kube-api-access-l2bkm") pod "83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" (UID: "83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9"). InnerVolumeSpecName "kube-api-access-l2bkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.558740 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" (UID: "83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.615642 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.615827 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.615909 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2bkm\" (UniqueName: \"kubernetes.io/projected/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9-kube-api-access-l2bkm\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:36 crc kubenswrapper[4735]: I1209 15:13:36.823121 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpflf"] Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.103664 4735 generic.go:334] "Generic (PLEG): container finished" podID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerID="1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700" exitCode=0 Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.103706 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfh9l" event={"ID":"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9","Type":"ContainerDied","Data":"1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700"} Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.103753 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dfh9l" event={"ID":"83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9","Type":"ContainerDied","Data":"5d824173d6eeb32e77889c0350144998de540fb00a3e136d17681b7b7742dec6"} Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.103773 4735 scope.go:117] "RemoveContainer" containerID="1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.103874 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dfh9l" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.103909 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vpflf" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="registry-server" containerID="cri-o://a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d" gracePeriod=2 Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.125473 4735 scope.go:117] "RemoveContainer" containerID="510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.137136 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dfh9l"] Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.140381 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dfh9l"] Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.194174 4735 scope.go:117] "RemoveContainer" containerID="d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.214088 4735 scope.go:117] "RemoveContainer" containerID="1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700" Dec 09 15:13:37 crc kubenswrapper[4735]: E1209 15:13:37.214501 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700\": container with ID starting with 1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700 not found: ID does not exist" containerID="1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.214552 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700"} err="failed to get container status \"1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700\": rpc error: code = NotFound desc = could not find container \"1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700\": container with ID starting with 1b4e15b66a1ca7f619cf6d234e98c203e95b87d2ee275f1d073b7760f0ba0700 not found: ID does not exist" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.214577 4735 scope.go:117] "RemoveContainer" containerID="510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f" Dec 09 15:13:37 crc kubenswrapper[4735]: E1209 15:13:37.214983 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f\": container with ID starting with 510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f not found: ID does not exist" containerID="510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.215027 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f"} err="failed to get container status \"510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f\": rpc error: code = NotFound desc = could not find container \"510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f\": container with ID starting with 
510e43094ab5a77e43a72df618fda17501adf97274c7a68e1771b13b77e43d9f not found: ID does not exist" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.215057 4735 scope.go:117] "RemoveContainer" containerID="d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212" Dec 09 15:13:37 crc kubenswrapper[4735]: E1209 15:13:37.215461 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212\": container with ID starting with d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212 not found: ID does not exist" containerID="d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.215585 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212"} err="failed to get container status \"d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212\": rpc error: code = NotFound desc = could not find container \"d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212\": container with ID starting with d03b7f85866625b66931ef13746e188e851fd1948664967443107f3907253212 not found: ID does not exist" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.423100 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" path="/var/lib/kubelet/pods/83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9/volumes" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.452155 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.529716 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j7nl\" (UniqueName: \"kubernetes.io/projected/71e2df4b-2240-4a2a-877d-ee3f317a2e70-kube-api-access-8j7nl\") pod \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.529839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-utilities\") pod \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.530043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-catalog-content\") pod \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\" (UID: \"71e2df4b-2240-4a2a-877d-ee3f317a2e70\") " Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.530628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-utilities" (OuterVolumeSpecName: "utilities") pod "71e2df4b-2240-4a2a-877d-ee3f317a2e70" (UID: "71e2df4b-2240-4a2a-877d-ee3f317a2e70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.535732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e2df4b-2240-4a2a-877d-ee3f317a2e70-kube-api-access-8j7nl" (OuterVolumeSpecName: "kube-api-access-8j7nl") pod "71e2df4b-2240-4a2a-877d-ee3f317a2e70" (UID: "71e2df4b-2240-4a2a-877d-ee3f317a2e70"). InnerVolumeSpecName "kube-api-access-8j7nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.545799 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71e2df4b-2240-4a2a-877d-ee3f317a2e70" (UID: "71e2df4b-2240-4a2a-877d-ee3f317a2e70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.631937 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.631968 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j7nl\" (UniqueName: \"kubernetes.io/projected/71e2df4b-2240-4a2a-877d-ee3f317a2e70-kube-api-access-8j7nl\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:37 crc kubenswrapper[4735]: I1209 15:13:37.631986 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71e2df4b-2240-4a2a-877d-ee3f317a2e70-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.114279 4735 generic.go:334] "Generic (PLEG): container finished" podID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerID="a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d" exitCode=0 Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.114334 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vpflf" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.114338 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpflf" event={"ID":"71e2df4b-2240-4a2a-877d-ee3f317a2e70","Type":"ContainerDied","Data":"a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d"} Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.114429 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vpflf" event={"ID":"71e2df4b-2240-4a2a-877d-ee3f317a2e70","Type":"ContainerDied","Data":"fe5f50901cb86f9d48ffd8dfb591f7151289ed44acdc1dacaadbdbd7294be5dc"} Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.114465 4735 scope.go:117] "RemoveContainer" containerID="a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.128561 4735 scope.go:117] "RemoveContainer" containerID="a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.142708 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpflf"] Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.146251 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vpflf"] Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.161079 4735 scope.go:117] "RemoveContainer" containerID="0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.173993 4735 scope.go:117] "RemoveContainer" containerID="a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d" Dec 09 15:13:38 crc kubenswrapper[4735]: E1209 15:13:38.174403 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d\": container with ID starting with a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d not found: ID does not exist" containerID="a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.174441 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d"} err="failed to get container status \"a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d\": rpc error: code = NotFound desc = could not find container \"a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d\": container with ID starting with a1e2a15b07ad9e09acf0d877425ebcf77010c77a1a693f6cb2bc709917ab152d not found: ID does not exist" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.174465 4735 scope.go:117] "RemoveContainer" containerID="a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16" Dec 09 15:13:38 crc kubenswrapper[4735]: E1209 15:13:38.174832 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16\": container with ID starting with a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16 not found: ID does not exist" containerID="a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.174873 4735 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16"} err="failed to get container status \"a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16\": rpc error: code = NotFound desc = could not find container \"a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16\": container with ID starting with a260423c0773bf7d8b43602a8097a15dc8db647fe2fbfd9772c325646c4b0e16 not found: ID does not exist" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.174907 4735 scope.go:117] "RemoveContainer" containerID="0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04" Dec 09 15:13:38 crc kubenswrapper[4735]: E1209 15:13:38.175222 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04\": container with ID starting with 0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04 not found: ID does not exist" containerID="0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04" Dec 09 15:13:38 crc kubenswrapper[4735]: I1209 15:13:38.175247 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04"} err="failed to get container status \"0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04\": rpc error: code = NotFound desc = could not find container \"0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04\": container with ID starting with 0d619734cfec34fb0c22b81e51b317152e30f6a6c9fa8a99ba177308d17c3a04 not found: ID does not exist" Dec 09 15:13:39 crc kubenswrapper[4735]: I1209 15:13:39.421045 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" path="/var/lib/kubelet/pods/71e2df4b-2240-4a2a-877d-ee3f317a2e70/volumes" Dec 09 15:13:48 crc kubenswrapper[4735]: I1209 15:13:48.023572 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-86bc98687f-dcjrx" Dec 09 15:14:04 crc kubenswrapper[4735]: I1209 15:14:04.335963 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:14:04 crc kubenswrapper[4735]: I1209 15:14:04.337365 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:14:04 crc kubenswrapper[4735]: I1209 15:14:04.337529 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:14:04 crc kubenswrapper[4735]: I1209 15:14:04.338022 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a22ce376c7f734447e3e64908bb1c07a5b4ae150029c792b81dee199a1f86208"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:14:04 crc kubenswrapper[4735]: I1209 15:14:04.338144 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://a22ce376c7f734447e3e64908bb1c07a5b4ae150029c792b81dee199a1f86208" gracePeriod=600 Dec 09 15:14:05 crc kubenswrapper[4735]: I1209 15:14:05.280496 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="a22ce376c7f734447e3e64908bb1c07a5b4ae150029c792b81dee199a1f86208" exitCode=0 Dec 09 15:14:05 crc kubenswrapper[4735]: I1209 15:14:05.280561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"a22ce376c7f734447e3e64908bb1c07a5b4ae150029c792b81dee199a1f86208"} Dec 09 15:14:05 crc kubenswrapper[4735]: I1209 15:14:05.281118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"e3623d885b2b964e98770ce41e4b25f88400ebcd07eb10b6fc140eaa82071c9b"} Dec 09 15:14:05 crc kubenswrapper[4735]: I1209 15:14:05.281142 4735 scope.go:117] "RemoveContainer" containerID="32f296cb608e9d91aaf8195ce2837766de47464c288698a32b6b4cd28703999c" Dec 09 15:14:07 crc kubenswrapper[4735]: I1209 15:14:07.640298 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-577d56cdf4-lrqmp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.167554 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp"] Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.168952 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="extract-content" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.168985 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="extract-content" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.169014 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="registry-server" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169022 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="registry-server" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.169044 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="extract-utilities" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169052 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="extract-utilities" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.169074 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="registry-server" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169085 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="registry-server" Dec 09 
15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.169102 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="extract-utilities" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169108 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="extract-utilities" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.169123 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="extract-content" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169132 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="extract-content" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169397 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dd5ad1-ec47-49c6-b9d3-86ca3a6093d9" containerName="registry-server" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.169417 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e2df4b-2240-4a2a-877d-ee3f317a2e70" containerName="registry-server" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.170240 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.188163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a837ca7-0878-4803-b74a-ffdde5633d0b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.188220 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcknk\" (UniqueName: \"kubernetes.io/projected/5a837ca7-0878-4803-b74a-ffdde5633d0b-kube-api-access-rcknk\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.189262 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-b9pmj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.189488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.210847 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-x68zj"] Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.214142 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp"] Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.214255 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.216050 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.216447 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.247561 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8tx7p"] Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.248698 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.250505 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.250549 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fdmbm" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.250695 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.250715 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.264927 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-s87j2"] Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.266400 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.268065 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.272434 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-s87j2"] Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-conf\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-sockets\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290723 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-metrics\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290751 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-metallb-excludel2\") pod \"speaker-8tx7p\" (UID: 
\"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-metrics-certs\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290927 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.290982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-cert\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291037 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a837ca7-0878-4803-b74a-ffdde5633d0b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291064 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcknk\" (UniqueName: \"kubernetes.io/projected/5a837ca7-0878-4803-b74a-ffdde5633d0b-kube-api-access-rcknk\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291109 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2zzf\" (UniqueName: \"kubernetes.io/projected/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-kube-api-access-p2zzf\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291135 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-startup\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291163 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-metrics-certs\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291180 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sppk8\" (UniqueName: \"kubernetes.io/projected/2963bb65-bcd6-473e-a13a-fa1413c7564e-kube-api-access-sppk8\") pod \"frr-k8s-x68zj\" 
(UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2cw\" (UniqueName: \"kubernetes.io/projected/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-kube-api-access-qh2cw\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291256 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2963bb65-bcd6-473e-a13a-fa1413c7564e-metrics-certs\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.291300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-reloader\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.291420 4735 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.291469 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a837ca7-0878-4803-b74a-ffdde5633d0b-cert podName:5a837ca7-0878-4803-b74a-ffdde5633d0b nodeName:}" failed. No retries permitted until 2025-12-09 15:14:08.791452562 +0000 UTC m=+927.716291191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5a837ca7-0878-4803-b74a-ffdde5633d0b-cert") pod "frr-k8s-webhook-server-7fcb986d4-v8jpp" (UID: "5a837ca7-0878-4803-b74a-ffdde5633d0b") : secret "frr-k8s-webhook-server-cert" not found Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.310661 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcknk\" (UniqueName: \"kubernetes.io/projected/5a837ca7-0878-4803-b74a-ffdde5633d0b-kube-api-access-rcknk\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391802 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2963bb65-bcd6-473e-a13a-fa1413c7564e-metrics-certs\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391864 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-reloader\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391893 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-sockets\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391911 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-conf\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-metrics\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391956 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-metallb-excludel2\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.391999 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-metrics-certs\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " 
pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-cert\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392111 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2zzf\" (UniqueName: \"kubernetes.io/projected/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-kube-api-access-p2zzf\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392136 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-startup\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392156 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-metrics-certs\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392176 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sppk8\" (UniqueName: \"kubernetes.io/projected/2963bb65-bcd6-473e-a13a-fa1413c7564e-kube-api-access-sppk8\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392204 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2cw\" (UniqueName: \"kubernetes.io/projected/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-kube-api-access-qh2cw\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.392203 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.392228 4735 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.392299 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist podName:f8fafb99-3897-41e7-a1b7-aca9fce8ceae nodeName:}" failed. No retries permitted until 2025-12-09 15:14:08.892273085 +0000 UTC m=+927.817111714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist") pod "speaker-8tx7p" (UID: "f8fafb99-3897-41e7-a1b7-aca9fce8ceae") : secret "metallb-memberlist" not found Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.392325 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-metrics-certs podName:b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c nodeName:}" failed. 
No retries permitted until 2025-12-09 15:14:08.892315335 +0000 UTC m=+927.817153964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-metrics-certs") pod "controller-f8648f98b-s87j2" (UID: "b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c") : secret "controller-certs-secret" not found Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392356 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-sockets\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392390 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-metrics\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392851 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-reloader\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.392892 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-metallb-excludel2\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.393292 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-startup\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.393548 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2963bb65-bcd6-473e-a13a-fa1413c7564e-frr-conf\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.393944 4735 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.396112 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2963bb65-bcd6-473e-a13a-fa1413c7564e-metrics-certs\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.396814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-metrics-certs\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.405956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sppk8\" (UniqueName: 
\"kubernetes.io/projected/2963bb65-bcd6-473e-a13a-fa1413c7564e-kube-api-access-sppk8\") pod \"frr-k8s-x68zj\" (UID: \"2963bb65-bcd6-473e-a13a-fa1413c7564e\") " pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.406419 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-cert\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.406656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2zzf\" (UniqueName: \"kubernetes.io/projected/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-kube-api-access-p2zzf\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.410848 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2cw\" (UniqueName: \"kubernetes.io/projected/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-kube-api-access-qh2cw\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.526976 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.799843 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a837ca7-0878-4803-b74a-ffdde5633d0b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.804739 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a837ca7-0878-4803-b74a-ffdde5633d0b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-v8jpp\" (UID: \"5a837ca7-0878-4803-b74a-ffdde5633d0b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.815732 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.901124 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-metrics-certs\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.901188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.901495 4735 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 09 15:14:08 crc kubenswrapper[4735]: E1209 15:14:08.901638 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist podName:f8fafb99-3897-41e7-a1b7-aca9fce8ceae nodeName:}" failed. No retries permitted until 2025-12-09 15:14:09.901609465 +0000 UTC m=+928.826448093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist") pod "speaker-8tx7p" (UID: "f8fafb99-3897-41e7-a1b7-aca9fce8ceae") : secret "metallb-memberlist" not found Dec 09 15:14:08 crc kubenswrapper[4735]: I1209 15:14:08.906564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c-metrics-certs\") pod \"controller-f8648f98b-s87j2\" (UID: \"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c\") " pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.177572 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.198428 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp"] Dec 09 15:14:09 crc kubenswrapper[4735]: W1209 15:14:09.200468 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a837ca7_0878_4803_b74a_ffdde5633d0b.slice/crio-c8c717980f63a556946213b3cb4680f782a78481eb4cd88f976dc28abddf79bd WatchSource:0}: Error finding container c8c717980f63a556946213b3cb4680f782a78481eb4cd88f976dc28abddf79bd: Status 404 returned error can't find the container with id c8c717980f63a556946213b3cb4680f782a78481eb4cd88f976dc28abddf79bd Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.305669 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" event={"ID":"5a837ca7-0878-4803-b74a-ffdde5633d0b","Type":"ContainerStarted","Data":"c8c717980f63a556946213b3cb4680f782a78481eb4cd88f976dc28abddf79bd"} Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.306995 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"6f0759ff5933fd072706606ad555e202c81b6060afb8ca76b2a89c070569daff"} Dec 09 15:14:09 crc kubenswrapper[4735]: W1209 15:14:09.537370 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1d2fdb7_18ab_4723_aa7f_1f1b4e2d527c.slice/crio-a9816007f350ebb1c68dd11d14d1440a08dd7b33655abe5f50de2b682efc3496 WatchSource:0}: Error finding container a9816007f350ebb1c68dd11d14d1440a08dd7b33655abe5f50de2b682efc3496: Status 404 returned error can't find the container with id a9816007f350ebb1c68dd11d14d1440a08dd7b33655abe5f50de2b682efc3496 Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.537792 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-s87j2"] Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.916993 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:09 crc kubenswrapper[4735]: I1209 15:14:09.923261 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f8fafb99-3897-41e7-a1b7-aca9fce8ceae-memberlist\") pod \"speaker-8tx7p\" (UID: \"f8fafb99-3897-41e7-a1b7-aca9fce8ceae\") " pod="metallb-system/speaker-8tx7p" Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.065995 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8tx7p" Dec 09 15:14:10 crc kubenswrapper[4735]: W1209 15:14:10.086837 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fafb99_3897_41e7_a1b7_aca9fce8ceae.slice/crio-5df70bbf2ee99d47852a7bbc77832ac9967e58669669c50843f1bf6ef5056a61 WatchSource:0}: Error finding container 5df70bbf2ee99d47852a7bbc77832ac9967e58669669c50843f1bf6ef5056a61: Status 404 returned error can't find the container with id 5df70bbf2ee99d47852a7bbc77832ac9967e58669669c50843f1bf6ef5056a61 Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.324174 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-s87j2" event={"ID":"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c","Type":"ContainerStarted","Data":"c5e58bf342f1208baef15154a2734f6df4480f0160ca55f30d14ead8ba5bced7"} Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.324250 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-s87j2" event={"ID":"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c","Type":"ContainerStarted","Data":"239a66d4cde1bfe6f96599cc2bc9d9e18a23fcdbe8d7e4c177215d7f409edbed"} Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.324263 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-s87j2" event={"ID":"b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c","Type":"ContainerStarted","Data":"a9816007f350ebb1c68dd11d14d1440a08dd7b33655abe5f50de2b682efc3496"} Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.324372 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.326087 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8tx7p" event={"ID":"f8fafb99-3897-41e7-a1b7-aca9fce8ceae","Type":"ContainerStarted","Data":"5df70bbf2ee99d47852a7bbc77832ac9967e58669669c50843f1bf6ef5056a61"} Dec 09 15:14:10 crc kubenswrapper[4735]: I1209 15:14:10.340863 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-s87j2" podStartSLOduration=2.340844295 podStartE2EDuration="2.340844295s" podCreationTimestamp="2025-12-09 15:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:14:10.340467186 +0000 UTC m=+929.265305814" watchObservedRunningTime="2025-12-09 15:14:10.340844295 +0000 UTC m=+929.265682923" Dec 09 15:14:11 crc kubenswrapper[4735]: I1209 15:14:11.333498 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8tx7p" event={"ID":"f8fafb99-3897-41e7-a1b7-aca9fce8ceae","Type":"ContainerStarted","Data":"278d1d80fb4e841c4308246ecae76160aeea91c6f40add42a8d0120c8db65c5d"} Dec 09 15:14:11 crc kubenswrapper[4735]: I1209 15:14:11.333578 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8tx7p" event={"ID":"f8fafb99-3897-41e7-a1b7-aca9fce8ceae","Type":"ContainerStarted","Data":"76a58045c9263d799fe5b6e631f58955bb0e65befba67138e9a3547d8bc2a8b8"} Dec 09 15:14:11 crc kubenswrapper[4735]: I1209 15:14:11.333646 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8tx7p" Dec 09 15:14:11 crc kubenswrapper[4735]: I1209 15:14:11.353211 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-8tx7p" podStartSLOduration=3.353196216 podStartE2EDuration="3.353196216s" podCreationTimestamp="2025-12-09 15:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-09 15:14:11.348319747 +0000 UTC m=+930.273158374" watchObservedRunningTime="2025-12-09 15:14:11.353196216 +0000 UTC m=+930.278034845" Dec 09 15:14:15 crc kubenswrapper[4735]: I1209 15:14:15.365218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" event={"ID":"5a837ca7-0878-4803-b74a-ffdde5633d0b","Type":"ContainerStarted","Data":"6a43faa401a3648d2ce1d7535ff680967e8d27864859846fd3e6270015669715"} Dec 09 15:14:15 crc kubenswrapper[4735]: I1209 15:14:15.365481 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:14:15 crc kubenswrapper[4735]: I1209 15:14:15.379294 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" podStartSLOduration=1.5122415139999998 podStartE2EDuration="7.379275376s" podCreationTimestamp="2025-12-09 15:14:08 +0000 UTC" firstStartedPulling="2025-12-09 15:14:09.202386475 +0000 UTC m=+928.127225103" lastFinishedPulling="2025-12-09 15:14:15.069420337 +0000 UTC m=+933.994258965" observedRunningTime="2025-12-09 15:14:15.376247776 +0000 UTC m=+934.301086405" watchObservedRunningTime="2025-12-09 15:14:15.379275376 +0000 UTC m=+934.304114003" Dec 09 15:14:16 crc kubenswrapper[4735]: I1209 15:14:16.373527 4735 generic.go:334] "Generic (PLEG): container finished" podID="2963bb65-bcd6-473e-a13a-fa1413c7564e" containerID="5b21153d43c9606763701a5f4f7b8278f51ccc263c5973a71c4ee3b1701332f2" exitCode=0 Dec 09 15:14:16 crc kubenswrapper[4735]: I1209 15:14:16.373583 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerDied","Data":"5b21153d43c9606763701a5f4f7b8278f51ccc263c5973a71c4ee3b1701332f2"} Dec 09 15:14:17 crc kubenswrapper[4735]: I1209 15:14:17.383414 4735 generic.go:334] "Generic (PLEG): container finished" podID="2963bb65-bcd6-473e-a13a-fa1413c7564e" containerID="8f768f473aa322a3918617d13b2fe565ce2c312de9ccede595ce9784e78184e2" exitCode=0 Dec 09 15:14:17 crc kubenswrapper[4735]: I1209 15:14:17.383634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerDied","Data":"8f768f473aa322a3918617d13b2fe565ce2c312de9ccede595ce9784e78184e2"} Dec 09 15:14:18 crc kubenswrapper[4735]: I1209 15:14:18.392001 4735 generic.go:334] "Generic (PLEG): container finished" podID="2963bb65-bcd6-473e-a13a-fa1413c7564e" containerID="ca64ed7eeb9c4a24eb9ab785668fd2c8b2d9d3b719adf072dc4d3a20324b233d" exitCode=0 Dec 09 15:14:18 crc kubenswrapper[4735]: I1209 15:14:18.392050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerDied","Data":"ca64ed7eeb9c4a24eb9ab785668fd2c8b2d9d3b719adf072dc4d3a20324b233d"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.181997 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-s87j2" Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403347 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"1ceee44d45b3bb427f999bf879acb3d0c032ea7046986b52c6db90169de147f8"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"fc3a331c2701c19b6c6772c2d923e3e86089cb6af82f4e1aca192b6663f9f95b"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"adbea28b7faeb5c86c919ae0418bf3be264d112f79122ec79abd3d7e32d2a71a"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"e97bf78a1698ea617df327d559c91ad8f919327956462e918edf41757eef2318"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403443 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"dcd67be54cc40a6492ff9bbb16ea55c28312df17307b57212bfefe5c6fe4ec42"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x68zj" event={"ID":"2963bb65-bcd6-473e-a13a-fa1413c7564e","Type":"ContainerStarted","Data":"4bf08b3cc4809cfb1c48211ef310692801110617108e10d8925123d15b9acb2d"} Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.403585 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:19 crc kubenswrapper[4735]: I1209 15:14:19.429254 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-x68zj" podStartSLOduration=4.804772769 podStartE2EDuration="11.429224096s" podCreationTimestamp="2025-12-09 15:14:08 +0000 UTC" firstStartedPulling="2025-12-09 15:14:08.666430927 +0000 UTC m=+927.591269556" lastFinishedPulling="2025-12-09 15:14:15.290882254 +0000 UTC m=+934.215720883" observedRunningTime="2025-12-09 15:14:19.423508448 +0000 UTC m=+938.348347076" watchObservedRunningTime="2025-12-09 15:14:19.429224096 +0000 UTC m=+938.354062724" Dec 09 15:14:20 crc kubenswrapper[4735]: I1209 15:14:20.070245 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8tx7p" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.306838 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jd552"] Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.308220 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.311658 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2czs2" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.311915 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.322362 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.337403 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jd552"] Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.442880 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vq8\" (UniqueName: \"kubernetes.io/projected/8839fda6-42ea-479b-a9bf-3a27e1dd44cb-kube-api-access-s5vq8\") pod \"openstack-operator-index-jd552\" (UID: \"8839fda6-42ea-479b-a9bf-3a27e1dd44cb\") " pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.544583 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vq8\" (UniqueName: \"kubernetes.io/projected/8839fda6-42ea-479b-a9bf-3a27e1dd44cb-kube-api-access-s5vq8\") pod \"openstack-operator-index-jd552\" (UID: \"8839fda6-42ea-479b-a9bf-3a27e1dd44cb\") " pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.561821 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vq8\" (UniqueName: \"kubernetes.io/projected/8839fda6-42ea-479b-a9bf-3a27e1dd44cb-kube-api-access-s5vq8\") pod \"openstack-operator-index-jd552\" (UID: \"8839fda6-42ea-479b-a9bf-3a27e1dd44cb\") " pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:14:22 crc kubenswrapper[4735]: I1209 15:14:22.638627 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:14:23 crc kubenswrapper[4735]: I1209 15:14:23.063570 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jd552"] Dec 09 15:14:23 crc kubenswrapper[4735]: W1209 15:14:23.073993 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8839fda6_42ea_479b_a9bf_3a27e1dd44cb.slice/crio-151b60d9c6fe904b055f8cf480caa0aba3102a2922d7765b3f8bb24cb5b5ade1 WatchSource:0}: Error finding container 151b60d9c6fe904b055f8cf480caa0aba3102a2922d7765b3f8bb24cb5b5ade1: Status 404 returned error can't find the container with id 151b60d9c6fe904b055f8cf480caa0aba3102a2922d7765b3f8bb24cb5b5ade1 Dec 09 15:14:23 crc kubenswrapper[4735]: I1209 15:14:23.437498 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jd552" event={"ID":"8839fda6-42ea-479b-a9bf-3a27e1dd44cb","Type":"ContainerStarted","Data":"151b60d9c6fe904b055f8cf480caa0aba3102a2922d7765b3f8bb24cb5b5ade1"} Dec 09 15:14:23 crc kubenswrapper[4735]: I1209 15:14:23.527313 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:23 crc kubenswrapper[4735]: I1209 15:14:23.558495 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:25 crc kubenswrapper[4735]: I1209 15:14:25.687110 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jd552"] Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.294013 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jzq8f"] Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.294775 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jzq8f" Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.303344 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jzq8f"] Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.399909 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcxg\" (UniqueName: \"kubernetes.io/projected/9bfb1b1a-87c7-4fa5-ad02-935f53dbc081-kube-api-access-hxcxg\") pod \"openstack-operator-index-jzq8f\" (UID: \"9bfb1b1a-87c7-4fa5-ad02-935f53dbc081\") " pod="openstack-operators/openstack-operator-index-jzq8f" Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.502749 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcxg\" (UniqueName: \"kubernetes.io/projected/9bfb1b1a-87c7-4fa5-ad02-935f53dbc081-kube-api-access-hxcxg\") pod \"openstack-operator-index-jzq8f\" (UID: \"9bfb1b1a-87c7-4fa5-ad02-935f53dbc081\") " pod="openstack-operators/openstack-operator-index-jzq8f" Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.520807 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcxg\" (UniqueName: \"kubernetes.io/projected/9bfb1b1a-87c7-4fa5-ad02-935f53dbc081-kube-api-access-hxcxg\") pod \"openstack-operator-index-jzq8f\" (UID: \"9bfb1b1a-87c7-4fa5-ad02-935f53dbc081\") " pod="openstack-operators/openstack-operator-index-jzq8f" Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.609483 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jzq8f" Dec 09 15:14:26 crc kubenswrapper[4735]: I1209 15:14:26.998272 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jzq8f"] Dec 09 15:14:27 crc kubenswrapper[4735]: I1209 15:14:27.466991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jzq8f" event={"ID":"9bfb1b1a-87c7-4fa5-ad02-935f53dbc081","Type":"ContainerStarted","Data":"fa770e326b097b40fe7fa23045adcc16f72bd01bfd401e5177b944adc29abcfd"} Dec 09 15:14:28 crc kubenswrapper[4735]: I1209 15:14:28.530447 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-x68zj" Dec 09 15:14:28 crc kubenswrapper[4735]: I1209 15:14:28.821648 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-v8jpp" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.140737 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5"] Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.142425 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.144274 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.146624 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.149679 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5"] Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.298984 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5225b0ad-49dd-404e-8575-00bc9146cb62-config-volume\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.299117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5225b0ad-49dd-404e-8575-00bc9146cb62-secret-volume\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.299201 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8xw6\" (UniqueName: \"kubernetes.io/projected/5225b0ad-49dd-404e-8575-00bc9146cb62-kube-api-access-p8xw6\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.400337 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5225b0ad-49dd-404e-8575-00bc9146cb62-config-volume\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.400429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5225b0ad-49dd-404e-8575-00bc9146cb62-secret-volume\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.400524 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8xw6\" (UniqueName: \"kubernetes.io/projected/5225b0ad-49dd-404e-8575-00bc9146cb62-kube-api-access-p8xw6\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.401345 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5225b0ad-49dd-404e-8575-00bc9146cb62-config-volume\") pod 
\"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.405618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5225b0ad-49dd-404e-8575-00bc9146cb62-secret-volume\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.414653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8xw6\" (UniqueName: \"kubernetes.io/projected/5225b0ad-49dd-404e-8575-00bc9146cb62-kube-api-access-p8xw6\") pod \"collect-profiles-29421555-5fhw5\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.458987 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:00 crc kubenswrapper[4735]: I1209 15:15:00.860082 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5"] Dec 09 15:15:01 crc kubenswrapper[4735]: I1209 15:15:01.659109 4735 generic.go:334] "Generic (PLEG): container finished" podID="5225b0ad-49dd-404e-8575-00bc9146cb62" containerID="279b9695b09439bc4aae667f0d8e8f38df3b2fac132178c0174ab13faf9c42ba" exitCode=0 Dec 09 15:15:01 crc kubenswrapper[4735]: I1209 15:15:01.659164 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" event={"ID":"5225b0ad-49dd-404e-8575-00bc9146cb62","Type":"ContainerDied","Data":"279b9695b09439bc4aae667f0d8e8f38df3b2fac132178c0174ab13faf9c42ba"} Dec 09 15:15:01 crc kubenswrapper[4735]: I1209 15:15:01.659474 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" event={"ID":"5225b0ad-49dd-404e-8575-00bc9146cb62","Type":"ContainerStarted","Data":"7e4c50fa7fd030d99a5879fcbeb41a7ff7f839e7e08d9ff962dbcbaa74b4fdf5"} Dec 09 15:15:02 crc kubenswrapper[4735]: I1209 15:15:02.878548 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.035718 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8xw6\" (UniqueName: \"kubernetes.io/projected/5225b0ad-49dd-404e-8575-00bc9146cb62-kube-api-access-p8xw6\") pod \"5225b0ad-49dd-404e-8575-00bc9146cb62\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.035884 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5225b0ad-49dd-404e-8575-00bc9146cb62-config-volume\") pod \"5225b0ad-49dd-404e-8575-00bc9146cb62\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.035916 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5225b0ad-49dd-404e-8575-00bc9146cb62-secret-volume\") pod \"5225b0ad-49dd-404e-8575-00bc9146cb62\" (UID: \"5225b0ad-49dd-404e-8575-00bc9146cb62\") " Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.036900 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5225b0ad-49dd-404e-8575-00bc9146cb62-config-volume" (OuterVolumeSpecName: "config-volume") pod "5225b0ad-49dd-404e-8575-00bc9146cb62" (UID: "5225b0ad-49dd-404e-8575-00bc9146cb62"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.040422 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5225b0ad-49dd-404e-8575-00bc9146cb62-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5225b0ad-49dd-404e-8575-00bc9146cb62" (UID: "5225b0ad-49dd-404e-8575-00bc9146cb62"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.040642 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225b0ad-49dd-404e-8575-00bc9146cb62-kube-api-access-p8xw6" (OuterVolumeSpecName: "kube-api-access-p8xw6") pod "5225b0ad-49dd-404e-8575-00bc9146cb62" (UID: "5225b0ad-49dd-404e-8575-00bc9146cb62"). InnerVolumeSpecName "kube-api-access-p8xw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.137804 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8xw6\" (UniqueName: \"kubernetes.io/projected/5225b0ad-49dd-404e-8575-00bc9146cb62-kube-api-access-p8xw6\") on node \"crc\" DevicePath \"\"" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.137851 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5225b0ad-49dd-404e-8575-00bc9146cb62-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.137861 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5225b0ad-49dd-404e-8575-00bc9146cb62-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.671303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" event={"ID":"5225b0ad-49dd-404e-8575-00bc9146cb62","Type":"ContainerDied","Data":"7e4c50fa7fd030d99a5879fcbeb41a7ff7f839e7e08d9ff962dbcbaa74b4fdf5"} Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.671346 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e4c50fa7fd030d99a5879fcbeb41a7ff7f839e7e08d9ff962dbcbaa74b4fdf5" Dec 09 15:15:03 crc kubenswrapper[4735]: I1209 15:15:03.671407 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421555-5fhw5" Dec 09 15:16:04 crc kubenswrapper[4735]: I1209 15:16:04.336264 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:16:04 crc kubenswrapper[4735]: I1209 15:16:04.336893 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:16:23 crc kubenswrapper[4735]: E1209 15:16:23.083060 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:16:23 crc kubenswrapper[4735]: E1209 15:16:23.083661 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:16:23 crc kubenswrapper[4735]: E1209 15:16:23.083818 4735 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5vq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jd552_openstack-operators(8839fda6-42ea-479b-a9bf-3a27e1dd44cb): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:16:23 crc kubenswrapper[4735]: E1209 15:16:23.085137 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jd552" podUID="8839fda6-42ea-479b-a9bf-3a27e1dd44cb" Dec 09 15:16:23 crc kubenswrapper[4735]: I1209 15:16:23.375193 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:16:23 crc kubenswrapper[4735]: I1209 15:16:23.437575 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vq8\" (UniqueName: \"kubernetes.io/projected/8839fda6-42ea-479b-a9bf-3a27e1dd44cb-kube-api-access-s5vq8\") pod \"8839fda6-42ea-479b-a9bf-3a27e1dd44cb\" (UID: \"8839fda6-42ea-479b-a9bf-3a27e1dd44cb\") " Dec 09 15:16:23 crc kubenswrapper[4735]: I1209 15:16:23.444707 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8839fda6-42ea-479b-a9bf-3a27e1dd44cb-kube-api-access-s5vq8" (OuterVolumeSpecName: "kube-api-access-s5vq8") pod "8839fda6-42ea-479b-a9bf-3a27e1dd44cb" (UID: "8839fda6-42ea-479b-a9bf-3a27e1dd44cb"). InnerVolumeSpecName "kube-api-access-s5vq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:16:23 crc kubenswrapper[4735]: I1209 15:16:23.538552 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vq8\" (UniqueName: \"kubernetes.io/projected/8839fda6-42ea-479b-a9bf-3a27e1dd44cb-kube-api-access-s5vq8\") on node \"crc\" DevicePath \"\"" Dec 09 15:16:24 crc kubenswrapper[4735]: I1209 15:16:24.172622 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jd552" event={"ID":"8839fda6-42ea-479b-a9bf-3a27e1dd44cb","Type":"ContainerDied","Data":"151b60d9c6fe904b055f8cf480caa0aba3102a2922d7765b3f8bb24cb5b5ade1"} Dec 09 15:16:24 crc kubenswrapper[4735]: I1209 15:16:24.172708 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jd552" Dec 09 15:16:24 crc kubenswrapper[4735]: I1209 15:16:24.205082 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jd552"] Dec 09 15:16:24 crc kubenswrapper[4735]: I1209 15:16:24.208347 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jd552"] Dec 09 15:16:25 crc kubenswrapper[4735]: I1209 15:16:25.423236 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8839fda6-42ea-479b-a9bf-3a27e1dd44cb" path="/var/lib/kubelet/pods/8839fda6-42ea-479b-a9bf-3a27e1dd44cb/volumes" Dec 09 15:16:27 crc kubenswrapper[4735]: E1209 15:16:27.015129 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:16:27 crc kubenswrapper[4735]: E1209 15:16:27.015184 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:16:27 crc kubenswrapper[4735]: E1209 15:16:27.015335 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxcxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jzq8f_openstack-operators(9bfb1b1a-87c7-4fa5-ad02-935f53dbc081): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:16:27 crc kubenswrapper[4735]: E1209 15:16:27.016501 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:16:27 crc kubenswrapper[4735]: E1209 15:16:27.195104 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" 
podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:16:34 crc kubenswrapper[4735]: I1209 15:16:34.335916 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:16:34 crc kubenswrapper[4735]: I1209 15:16:34.336459 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:17:04 crc kubenswrapper[4735]: I1209 15:17:04.336253 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:17:04 crc kubenswrapper[4735]: I1209 15:17:04.336766 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:17:04 crc kubenswrapper[4735]: I1209 15:17:04.336816 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:17:04 crc kubenswrapper[4735]: I1209 15:17:04.337275 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e3623d885b2b964e98770ce41e4b25f88400ebcd07eb10b6fc140eaa82071c9b"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:17:04 crc kubenswrapper[4735]: I1209 15:17:04.337326 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://e3623d885b2b964e98770ce41e4b25f88400ebcd07eb10b6fc140eaa82071c9b" gracePeriod=600 Dec 09 15:17:05 crc kubenswrapper[4735]: I1209 15:17:05.394371 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="e3623d885b2b964e98770ce41e4b25f88400ebcd07eb10b6fc140eaa82071c9b" exitCode=0 Dec 09 15:17:05 crc kubenswrapper[4735]: I1209 15:17:05.394455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"e3623d885b2b964e98770ce41e4b25f88400ebcd07eb10b6fc140eaa82071c9b"} Dec 09 15:17:05 crc kubenswrapper[4735]: I1209 15:17:05.394655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"e1979881ab15f986f5604820cd914ee70602c888f5970309366c4dd6823a3f20"} Dec 09 15:17:05 crc kubenswrapper[4735]: I1209 
15:17:05.394681 4735 scope.go:117] "RemoveContainer" containerID="a22ce376c7f734447e3e64908bb1c07a5b4ae150029c792b81dee199a1f86208" Dec 09 15:18:40 crc kubenswrapper[4735]: E1209 15:18:40.421680 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:18:40 crc kubenswrapper[4735]: E1209 15:18:40.421994 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:18:40 crc kubenswrapper[4735]: E1209 15:18:40.422130 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxcxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jzq8f_openstack-operators(9bfb1b1a-87c7-4fa5-ad02-935f53dbc081): ErrImagePull: rpc error: code = 
DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:18:40 crc kubenswrapper[4735]: E1209 15:18:40.423304 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:18:54 crc kubenswrapper[4735]: E1209 15:18:54.415222 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:19:04 crc kubenswrapper[4735]: I1209 15:19:04.336120 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:19:04 crc kubenswrapper[4735]: I1209 15:19:04.337192 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:19:07 crc kubenswrapper[4735]: I1209 15:19:07.416431 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:19:34 crc kubenswrapper[4735]: I1209 15:19:34.336153 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:19:34 crc kubenswrapper[4735]: I1209 15:19:34.336564 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:20:04 crc kubenswrapper[4735]: I1209 15:20:04.336302 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:20:04 crc kubenswrapper[4735]: I1209 15:20:04.337179 4735 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:20:04 crc kubenswrapper[4735]: I1209 15:20:04.337243 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:20:04 crc kubenswrapper[4735]: I1209 15:20:04.338256 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1979881ab15f986f5604820cd914ee70602c888f5970309366c4dd6823a3f20"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:20:04 crc kubenswrapper[4735]: I1209 15:20:04.338328 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://e1979881ab15f986f5604820cd914ee70602c888f5970309366c4dd6823a3f20" gracePeriod=600 Dec 09 15:20:05 crc kubenswrapper[4735]: I1209 15:20:05.338722 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="e1979881ab15f986f5604820cd914ee70602c888f5970309366c4dd6823a3f20" exitCode=0 Dec 09 15:20:05 crc kubenswrapper[4735]: I1209 15:20:05.338921 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"e1979881ab15f986f5604820cd914ee70602c888f5970309366c4dd6823a3f20"} Dec 09 15:20:05 crc kubenswrapper[4735]: I1209 15:20:05.340127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a"} Dec 09 15:20:05 crc kubenswrapper[4735]: I1209 15:20:05.340170 4735 scope.go:117] "RemoveContainer" containerID="e3623d885b2b964e98770ce41e4b25f88400ebcd07eb10b6fc140eaa82071c9b" Dec 09 15:21:07 crc kubenswrapper[4735]: E1209 15:21:07.419411 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:21:07 crc kubenswrapper[4735]: E1209 15:21:07.419880 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:21:07 crc kubenswrapper[4735]: E1209 15:21:07.420212 
4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxcxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jzq8f_openstack-operators(9bfb1b1a-87c7-4fa5-ad02-935f53dbc081): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:21:07 crc kubenswrapper[4735]: E1209 15:21:07.422389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:21:20 crc kubenswrapper[4735]: E1209 15:21:20.415978 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" 
pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:21:34 crc kubenswrapper[4735]: E1209 15:21:34.415295 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:22:04 crc kubenswrapper[4735]: I1209 15:22:04.336212 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:22:04 crc kubenswrapper[4735]: I1209 15:22:04.336669 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:22:34 crc kubenswrapper[4735]: I1209 15:22:34.335598 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:22:34 crc kubenswrapper[4735]: I1209 15:22:34.336305 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:23:04 crc kubenswrapper[4735]: I1209 15:23:04.335726 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:23:04 crc kubenswrapper[4735]: I1209 15:23:04.336328 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:23:04 crc kubenswrapper[4735]: I1209 15:23:04.336370 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:23:04 crc kubenswrapper[4735]: I1209 15:23:04.337171 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:23:04 crc kubenswrapper[4735]: I1209 15:23:04.337225 4735 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" gracePeriod=600 Dec 09 15:23:04 crc kubenswrapper[4735]: E1209 15:23:04.459191 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:23:05 crc kubenswrapper[4735]: I1209 15:23:05.324348 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" exitCode=0 Dec 09 15:23:05 crc kubenswrapper[4735]: I1209 15:23:05.324387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a"} Dec 09 15:23:05 crc kubenswrapper[4735]: I1209 15:23:05.324417 4735 scope.go:117] "RemoveContainer" containerID="e1979881ab15f986f5604820cd914ee70602c888f5970309366c4dd6823a3f20" Dec 09 15:23:05 crc kubenswrapper[4735]: I1209 15:23:05.324902 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:23:05 crc kubenswrapper[4735]: E1209 15:23:05.325152 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:23:19 crc kubenswrapper[4735]: I1209 15:23:19.413634 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:23:19 crc kubenswrapper[4735]: E1209 15:23:19.414790 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:23:31 crc kubenswrapper[4735]: I1209 15:23:31.416807 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:23:31 crc kubenswrapper[4735]: E1209 15:23:31.417655 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 
15:23:39 crc kubenswrapper[4735]: I1209 15:23:39.803397 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p8xc8"] Dec 09 15:23:39 crc kubenswrapper[4735]: E1209 15:23:39.804110 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5225b0ad-49dd-404e-8575-00bc9146cb62" containerName="collect-profiles" Dec 09 15:23:39 crc kubenswrapper[4735]: I1209 15:23:39.804124 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5225b0ad-49dd-404e-8575-00bc9146cb62" containerName="collect-profiles" Dec 09 15:23:39 crc kubenswrapper[4735]: I1209 15:23:39.804234 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5225b0ad-49dd-404e-8575-00bc9146cb62" containerName="collect-profiles" Dec 09 15:23:39 crc kubenswrapper[4735]: I1209 15:23:39.805064 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:39 crc kubenswrapper[4735]: I1209 15:23:39.812087 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p8xc8"] Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.000196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drb46\" (UniqueName: \"kubernetes.io/projected/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-kube-api-access-drb46\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.000270 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-catalog-content\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.000299 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-utilities\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.101581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drb46\" (UniqueName: \"kubernetes.io/projected/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-kube-api-access-drb46\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.101665 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-catalog-content\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.101689 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-utilities\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc 
kubenswrapper[4735]: I1209 15:23:40.102166 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-catalog-content\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.102190 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-utilities\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.120696 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drb46\" (UniqueName: \"kubernetes.io/projected/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-kube-api-access-drb46\") pod \"redhat-operators-p8xc8\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.124430 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:40 crc kubenswrapper[4735]: I1209 15:23:40.507433 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p8xc8"] Dec 09 15:23:41 crc kubenswrapper[4735]: I1209 15:23:41.517371 4735 generic.go:334] "Generic (PLEG): container finished" podID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerID="f3087e7bff515e5c7c8896a4fef0a8a4fc3a3f1bbb21e92ffb5d2f592fea08a2" exitCode=0 Dec 09 15:23:41 crc kubenswrapper[4735]: I1209 15:23:41.517473 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerDied","Data":"f3087e7bff515e5c7c8896a4fef0a8a4fc3a3f1bbb21e92ffb5d2f592fea08a2"} Dec 09 15:23:41 crc kubenswrapper[4735]: I1209 15:23:41.517724 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerStarted","Data":"d2e4c19307f42c3e960569f1cd2b198e1446f822e854683842560d35150f032e"} Dec 09 15:23:42 crc kubenswrapper[4735]: I1209 15:23:42.413888 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:23:42 crc kubenswrapper[4735]: E1209 15:23:42.414437 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:23:42 crc kubenswrapper[4735]: I1209 15:23:42.525887 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerStarted","Data":"3faae225fcd5da44b31319535f6302604c746b4580f0e13a471f4e85d544bfde"} Dec 09 15:23:43 crc kubenswrapper[4735]: I1209 15:23:43.532807 4735 generic.go:334] "Generic (PLEG): container finished" podID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" 
containerID="3faae225fcd5da44b31319535f6302604c746b4580f0e13a471f4e85d544bfde" exitCode=0 Dec 09 15:23:43 crc kubenswrapper[4735]: I1209 15:23:43.532852 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerDied","Data":"3faae225fcd5da44b31319535f6302604c746b4580f0e13a471f4e85d544bfde"} Dec 09 15:23:44 crc kubenswrapper[4735]: I1209 15:23:44.541455 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerStarted","Data":"991ee2826ecedf3556e18d97ec0990ad493123f3f21911b3a2c45ed95e0d575e"} Dec 09 15:23:44 crc kubenswrapper[4735]: I1209 15:23:44.559693 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p8xc8" podStartSLOduration=3.04945587 podStartE2EDuration="5.559676779s" podCreationTimestamp="2025-12-09 15:23:39 +0000 UTC" firstStartedPulling="2025-12-09 15:23:41.518863879 +0000 UTC m=+1500.443702507" lastFinishedPulling="2025-12-09 15:23:44.029084788 +0000 UTC m=+1502.953923416" observedRunningTime="2025-12-09 15:23:44.557728881 +0000 UTC m=+1503.482567509" watchObservedRunningTime="2025-12-09 15:23:44.559676779 +0000 UTC m=+1503.484515406" Dec 09 15:23:48 crc kubenswrapper[4735]: E1209 15:23:48.421246 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:23:48 crc kubenswrapper[4735]: E1209 15:23:48.421651 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:23:48 crc kubenswrapper[4735]: E1209 15:23:48.421767 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxcxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jzq8f_openstack-operators(9bfb1b1a-87c7-4fa5-ad02-935f53dbc081): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:23:48 crc kubenswrapper[4735]: E1209 15:23:48.422945 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:23:50 crc kubenswrapper[4735]: I1209 15:23:50.124687 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:50 crc kubenswrapper[4735]: I1209 15:23:50.124739 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:50 crc kubenswrapper[4735]: I1209 15:23:50.160125 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:50 crc kubenswrapper[4735]: I1209 15:23:50.604682 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:50 crc kubenswrapper[4735]: I1209 15:23:50.637759 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p8xc8"] Dec 09 15:23:52 crc kubenswrapper[4735]: I1209 15:23:52.588628 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p8xc8" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="registry-server" 
containerID="cri-o://991ee2826ecedf3556e18d97ec0990ad493123f3f21911b3a2c45ed95e0d575e" gracePeriod=2 Dec 09 15:23:53 crc kubenswrapper[4735]: I1209 15:23:53.596008 4735 generic.go:334] "Generic (PLEG): container finished" podID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerID="991ee2826ecedf3556e18d97ec0990ad493123f3f21911b3a2c45ed95e0d575e" exitCode=0 Dec 09 15:23:53 crc kubenswrapper[4735]: I1209 15:23:53.596050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerDied","Data":"991ee2826ecedf3556e18d97ec0990ad493123f3f21911b3a2c45ed95e0d575e"} Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.021133 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.089969 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-catalog-content\") pod \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.090016 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drb46\" (UniqueName: \"kubernetes.io/projected/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-kube-api-access-drb46\") pod \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.090059 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-utilities\") pod \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\" (UID: \"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c\") " Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.090820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-utilities" (OuterVolumeSpecName: "utilities") pod "73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" (UID: "73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.095121 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-kube-api-access-drb46" (OuterVolumeSpecName: "kube-api-access-drb46") pod "73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" (UID: "73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c"). InnerVolumeSpecName "kube-api-access-drb46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.171151 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" (UID: "73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.192113 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drb46\" (UniqueName: \"kubernetes.io/projected/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-kube-api-access-drb46\") on node \"crc\" DevicePath \"\"" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.192137 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.192148 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.603169 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8xc8" event={"ID":"73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c","Type":"ContainerDied","Data":"d2e4c19307f42c3e960569f1cd2b198e1446f822e854683842560d35150f032e"} Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.603209 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8xc8" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.603268 4735 scope.go:117] "RemoveContainer" containerID="991ee2826ecedf3556e18d97ec0990ad493123f3f21911b3a2c45ed95e0d575e" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.619947 4735 scope.go:117] "RemoveContainer" containerID="3faae225fcd5da44b31319535f6302604c746b4580f0e13a471f4e85d544bfde" Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.628883 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p8xc8"] Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.632705 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p8xc8"] Dec 09 15:23:54 crc kubenswrapper[4735]: I1209 15:23:54.644718 4735 scope.go:117] "RemoveContainer" containerID="f3087e7bff515e5c7c8896a4fef0a8a4fc3a3f1bbb21e92ffb5d2f592fea08a2" Dec 09 15:23:55 crc kubenswrapper[4735]: I1209 15:23:55.422110 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" path="/var/lib/kubelet/pods/73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c/volumes" Dec 09 15:23:57 crc kubenswrapper[4735]: I1209 15:23:57.414336 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:23:57 crc kubenswrapper[4735]: E1209 15:23:57.414562 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:24:01 crc kubenswrapper[4735]: E1209 15:24:01.420749 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" 
pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.604907 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zzc"] Dec 09 15:24:08 crc kubenswrapper[4735]: E1209 15:24:08.605457 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="extract-content" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.605469 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="extract-content" Dec 09 15:24:08 crc kubenswrapper[4735]: E1209 15:24:08.605485 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="extract-utilities" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.605490 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="extract-utilities" Dec 09 15:24:08 crc kubenswrapper[4735]: E1209 15:24:08.605499 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="registry-server" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.605505 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="registry-server" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.605652 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a96b0a-0b2d-43bb-b0dd-f57e5cdc3a5c" containerName="registry-server" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.606485 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.611831 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zzc"] Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.657566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-utilities\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.657668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-catalog-content\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.657696 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psj5d\" (UniqueName: \"kubernetes.io/projected/e41b8ce2-282a-4ef7-9142-577a0320b7c1-kube-api-access-psj5d\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.758579 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-utilities\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.758655 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-catalog-content\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.758679 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psj5d\" (UniqueName: \"kubernetes.io/projected/e41b8ce2-282a-4ef7-9142-577a0320b7c1-kube-api-access-psj5d\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.759021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-utilities\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.759188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-catalog-content\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.774741 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-psj5d\" (UniqueName: \"kubernetes.io/projected/e41b8ce2-282a-4ef7-9142-577a0320b7c1-kube-api-access-psj5d\") pod \"redhat-marketplace-h4zzc\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:08 crc kubenswrapper[4735]: I1209 15:24:08.921274 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:09 crc kubenswrapper[4735]: I1209 15:24:09.284718 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zzc"] Dec 09 15:24:09 crc kubenswrapper[4735]: W1209 15:24:09.287267 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode41b8ce2_282a_4ef7_9142_577a0320b7c1.slice/crio-0381f92000c6ab34b6f963e4087de783cee43d0991ab421674591d607fe9caa1 WatchSource:0}: Error finding container 0381f92000c6ab34b6f963e4087de783cee43d0991ab421674591d607fe9caa1: Status 404 returned error can't find the container with id 0381f92000c6ab34b6f963e4087de783cee43d0991ab421674591d607fe9caa1 Dec 09 15:24:09 crc kubenswrapper[4735]: I1209 15:24:09.674626 4735 generic.go:334] "Generic (PLEG): container finished" podID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerID="02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3" exitCode=0 Dec 09 15:24:09 crc kubenswrapper[4735]: I1209 15:24:09.674663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zzc" event={"ID":"e41b8ce2-282a-4ef7-9142-577a0320b7c1","Type":"ContainerDied","Data":"02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3"} Dec 09 15:24:09 crc kubenswrapper[4735]: I1209 15:24:09.674684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zzc" event={"ID":"e41b8ce2-282a-4ef7-9142-577a0320b7c1","Type":"ContainerStarted","Data":"0381f92000c6ab34b6f963e4087de783cee43d0991ab421674591d607fe9caa1"} Dec 09 15:24:09 crc kubenswrapper[4735]: I1209 15:24:09.676138 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:24:10 crc kubenswrapper[4735]: I1209 15:24:10.413460 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:24:10 crc kubenswrapper[4735]: E1209 15:24:10.413806 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:24:10 crc kubenswrapper[4735]: I1209 15:24:10.681318 4735 generic.go:334] "Generic (PLEG): container finished" podID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerID="d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0" exitCode=0 Dec 09 15:24:10 crc kubenswrapper[4735]: I1209 15:24:10.681354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zzc" event={"ID":"e41b8ce2-282a-4ef7-9142-577a0320b7c1","Type":"ContainerDied","Data":"d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0"} Dec 09 15:24:11 crc kubenswrapper[4735]: I1209 
15:24:11.688892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zzc" event={"ID":"e41b8ce2-282a-4ef7-9142-577a0320b7c1","Type":"ContainerStarted","Data":"84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1"} Dec 09 15:24:11 crc kubenswrapper[4735]: I1209 15:24:11.702786 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h4zzc" podStartSLOduration=2.20032436 podStartE2EDuration="3.70277077s" podCreationTimestamp="2025-12-09 15:24:08 +0000 UTC" firstStartedPulling="2025-12-09 15:24:09.675915118 +0000 UTC m=+1528.600753746" lastFinishedPulling="2025-12-09 15:24:11.178361528 +0000 UTC m=+1530.103200156" observedRunningTime="2025-12-09 15:24:11.700943962 +0000 UTC m=+1530.625782590" watchObservedRunningTime="2025-12-09 15:24:11.70277077 +0000 UTC m=+1530.627609399" Dec 09 15:24:14 crc kubenswrapper[4735]: I1209 15:24:14.999139 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qjkln"] Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.000484 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.006295 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjkln"] Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.130199 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-catalog-content\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.130243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmnfn\" (UniqueName: \"kubernetes.io/projected/784faae9-1603-40f3-acf7-c828c03774fb-kube-api-access-pmnfn\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.130277 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-utilities\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.231152 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-catalog-content\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.231195 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmnfn\" (UniqueName: \"kubernetes.io/projected/784faae9-1603-40f3-acf7-c828c03774fb-kube-api-access-pmnfn\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 
15:24:15.231223 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-utilities\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.231603 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-catalog-content\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.231622 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-utilities\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.250147 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmnfn\" (UniqueName: \"kubernetes.io/projected/784faae9-1603-40f3-acf7-c828c03774fb-kube-api-access-pmnfn\") pod \"community-operators-qjkln\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.314309 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:15 crc kubenswrapper[4735]: E1209 15:24:15.417823 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.673867 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjkln"] Dec 09 15:24:15 crc kubenswrapper[4735]: I1209 15:24:15.707967 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjkln" event={"ID":"784faae9-1603-40f3-acf7-c828c03774fb","Type":"ContainerStarted","Data":"d079f62d176835d20c1283edeebe11c56873f6b2bb8df296c2ca2b79ed5eb0b6"} Dec 09 15:24:16 crc kubenswrapper[4735]: I1209 15:24:16.714389 4735 generic.go:334] "Generic (PLEG): container finished" podID="784faae9-1603-40f3-acf7-c828c03774fb" containerID="ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f" exitCode=0 Dec 09 15:24:16 crc kubenswrapper[4735]: I1209 15:24:16.714423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjkln" event={"ID":"784faae9-1603-40f3-acf7-c828c03774fb","Type":"ContainerDied","Data":"ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f"} Dec 09 15:24:17 crc kubenswrapper[4735]: I1209 15:24:17.721172 4735 generic.go:334] "Generic (PLEG): container finished" podID="784faae9-1603-40f3-acf7-c828c03774fb" containerID="2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5" exitCode=0 Dec 09 15:24:17 crc kubenswrapper[4735]: I1209 15:24:17.721236 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjkln" event={"ID":"784faae9-1603-40f3-acf7-c828c03774fb","Type":"ContainerDied","Data":"2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5"} Dec 09 15:24:18 crc kubenswrapper[4735]: I1209 15:24:18.729569 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjkln" event={"ID":"784faae9-1603-40f3-acf7-c828c03774fb","Type":"ContainerStarted","Data":"d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f"} Dec 09 15:24:18 crc kubenswrapper[4735]: I1209 15:24:18.743798 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qjkln" podStartSLOduration=3.302697281 podStartE2EDuration="4.743784693s" podCreationTimestamp="2025-12-09 15:24:14 +0000 UTC" firstStartedPulling="2025-12-09 15:24:16.71546965 +0000 UTC m=+1535.640308278" lastFinishedPulling="2025-12-09 15:24:18.156557062 +0000 UTC m=+1537.081395690" observedRunningTime="2025-12-09 15:24:18.740185966 +0000 UTC m=+1537.665024595" watchObservedRunningTime="2025-12-09 15:24:18.743784693 +0000 UTC m=+1537.668623321" Dec 09 15:24:18 crc kubenswrapper[4735]: I1209 15:24:18.922246 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:18 crc kubenswrapper[4735]: I1209 15:24:18.922311 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:18 crc kubenswrapper[4735]: I1209 15:24:18.954262 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:19 crc kubenswrapper[4735]: I1209 15:24:19.765042 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:21 crc kubenswrapper[4735]: I1209 15:24:21.191481 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zzc"] Dec 09 15:24:21 crc kubenswrapper[4735]: I1209 15:24:21.416748 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:24:21 crc kubenswrapper[4735]: E1209 15:24:21.417302 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:24:21 crc kubenswrapper[4735]: I1209 15:24:21.750078 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h4zzc" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="registry-server" containerID="cri-o://84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1" gracePeriod=2 Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.079714 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.204224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-utilities\") pod \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.204384 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-catalog-content\") pod \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.204458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psj5d\" (UniqueName: \"kubernetes.io/projected/e41b8ce2-282a-4ef7-9142-577a0320b7c1-kube-api-access-psj5d\") pod \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\" (UID: \"e41b8ce2-282a-4ef7-9142-577a0320b7c1\") " Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.204979 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-utilities" (OuterVolumeSpecName: "utilities") pod "e41b8ce2-282a-4ef7-9142-577a0320b7c1" (UID: "e41b8ce2-282a-4ef7-9142-577a0320b7c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.208841 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41b8ce2-282a-4ef7-9142-577a0320b7c1-kube-api-access-psj5d" (OuterVolumeSpecName: "kube-api-access-psj5d") pod "e41b8ce2-282a-4ef7-9142-577a0320b7c1" (UID: "e41b8ce2-282a-4ef7-9142-577a0320b7c1"). InnerVolumeSpecName "kube-api-access-psj5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.218672 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e41b8ce2-282a-4ef7-9142-577a0320b7c1" (UID: "e41b8ce2-282a-4ef7-9142-577a0320b7c1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.305880 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.305905 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psj5d\" (UniqueName: \"kubernetes.io/projected/e41b8ce2-282a-4ef7-9142-577a0320b7c1-kube-api-access-psj5d\") on node \"crc\" DevicePath \"\"" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.305915 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e41b8ce2-282a-4ef7-9142-577a0320b7c1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.759781 4735 generic.go:334] "Generic (PLEG): container finished" podID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerID="84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1" exitCode=0 Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.759841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zzc" event={"ID":"e41b8ce2-282a-4ef7-9142-577a0320b7c1","Type":"ContainerDied","Data":"84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1"} Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.759879 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h4zzc" event={"ID":"e41b8ce2-282a-4ef7-9142-577a0320b7c1","Type":"ContainerDied","Data":"0381f92000c6ab34b6f963e4087de783cee43d0991ab421674591d607fe9caa1"} Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.759900 4735 scope.go:117] "RemoveContainer" containerID="84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.759906 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h4zzc" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.773843 4735 scope.go:117] "RemoveContainer" containerID="d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.782892 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zzc"] Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.787466 4735 scope.go:117] "RemoveContainer" containerID="02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.789825 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h4zzc"] Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.806951 4735 scope.go:117] "RemoveContainer" containerID="84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1" Dec 09 15:24:22 crc kubenswrapper[4735]: E1209 15:24:22.807414 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1\": container with ID starting with 84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1 not found: ID does not exist" containerID="84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.807452 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1"} err="failed to get container status \"84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1\": rpc error: code = NotFound desc = could not find container \"84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1\": container with ID starting with 84c73dfd581c7a1d0fc93efbb8b469e257e87750b0cbe08491d315435a3adea1 not found: ID does not exist" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.807475 4735 scope.go:117] "RemoveContainer" containerID="d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0" Dec 09 15:24:22 crc kubenswrapper[4735]: E1209 15:24:22.807737 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0\": container with ID starting with d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0 not found: ID does not exist" containerID="d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.807765 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0"} err="failed to get container status \"d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0\": rpc error: code = NotFound desc = could not find container \"d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0\": container with ID starting with d6ec7f11571aaa746337e26b1de35cd2786159fa7f1b02f5e94d4bb87bf90df0 not found: ID does not exist" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.807787 4735 scope.go:117] "RemoveContainer" containerID="02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3" Dec 09 15:24:22 crc kubenswrapper[4735]: E1209 15:24:22.808063 4735 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3\": container with ID starting with 02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3 not found: ID does not exist" containerID="02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3" Dec 09 15:24:22 crc kubenswrapper[4735]: I1209 15:24:22.808155 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3"} err="failed to get container status \"02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3\": rpc error: code = NotFound desc = could not find container \"02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3\": container with ID starting with 02c8b5739aed954a4e3c455899624f75d0810a3fee0931f92f8d4e719178f1b3 not found: ID does not exist" Dec 09 15:24:23 crc kubenswrapper[4735]: I1209 15:24:23.420224 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" path="/var/lib/kubelet/pods/e41b8ce2-282a-4ef7-9142-577a0320b7c1/volumes" Dec 09 15:24:25 crc kubenswrapper[4735]: I1209 15:24:25.314987 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:25 crc kubenswrapper[4735]: I1209 15:24:25.315026 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:25 crc kubenswrapper[4735]: I1209 15:24:25.347038 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:25 crc kubenswrapper[4735]: I1209 15:24:25.809501 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:26 crc kubenswrapper[4735]: I1209 15:24:26.793106 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjkln"] Dec 09 15:24:27 crc kubenswrapper[4735]: I1209 15:24:27.789264 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qjkln" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="registry-server" containerID="cri-o://d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f" gracePeriod=2 Dec 09 15:24:28 crc kubenswrapper[4735]: E1209 15:24:28.414874 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.616225 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.785698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-catalog-content\") pod \"784faae9-1603-40f3-acf7-c828c03774fb\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.785742 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmnfn\" (UniqueName: \"kubernetes.io/projected/784faae9-1603-40f3-acf7-c828c03774fb-kube-api-access-pmnfn\") pod \"784faae9-1603-40f3-acf7-c828c03774fb\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.785899 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-utilities\") pod \"784faae9-1603-40f3-acf7-c828c03774fb\" (UID: \"784faae9-1603-40f3-acf7-c828c03774fb\") " Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.786548 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-utilities" (OuterVolumeSpecName: "utilities") pod "784faae9-1603-40f3-acf7-c828c03774fb" (UID: "784faae9-1603-40f3-acf7-c828c03774fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.794348 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784faae9-1603-40f3-acf7-c828c03774fb-kube-api-access-pmnfn" (OuterVolumeSpecName: "kube-api-access-pmnfn") pod "784faae9-1603-40f3-acf7-c828c03774fb" (UID: "784faae9-1603-40f3-acf7-c828c03774fb"). InnerVolumeSpecName "kube-api-access-pmnfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.801269 4735 generic.go:334] "Generic (PLEG): container finished" podID="784faae9-1603-40f3-acf7-c828c03774fb" containerID="d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f" exitCode=0 Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.801369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjkln" event={"ID":"784faae9-1603-40f3-acf7-c828c03774fb","Type":"ContainerDied","Data":"d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f"} Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.801348 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qjkln" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.801405 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjkln" event={"ID":"784faae9-1603-40f3-acf7-c828c03774fb","Type":"ContainerDied","Data":"d079f62d176835d20c1283edeebe11c56873f6b2bb8df296c2ca2b79ed5eb0b6"} Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.801424 4735 scope.go:117] "RemoveContainer" containerID="d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.818374 4735 scope.go:117] "RemoveContainer" containerID="2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.825772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "784faae9-1603-40f3-acf7-c828c03774fb" (UID: "784faae9-1603-40f3-acf7-c828c03774fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.838871 4735 scope.go:117] "RemoveContainer" containerID="ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.856714 4735 scope.go:117] "RemoveContainer" containerID="d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f" Dec 09 15:24:28 crc kubenswrapper[4735]: E1209 15:24:28.857030 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f\": container with ID starting with d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f not found: ID does not exist" containerID="d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.857068 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f"} err="failed to get container status \"d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f\": rpc error: code = NotFound desc = could not find container \"d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f\": container with ID starting with d62b7f70dac240254a98e8f50728c535078fe5f0d7437a14573612a8a007104f not found: ID does not exist" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.857095 4735 scope.go:117] "RemoveContainer" containerID="2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5" Dec 09 15:24:28 crc kubenswrapper[4735]: E1209 15:24:28.857302 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5\": container with ID starting with 2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5 not found: ID does not exist" containerID="2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.857323 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5"} err="failed to get container status 
\"2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5\": rpc error: code = NotFound desc = could not find container \"2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5\": container with ID starting with 2a4c40e1ba3df143c6c460ddf2d996fcce2d5b52b1b884da39abcbd5870fa4f5 not found: ID does not exist" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.857335 4735 scope.go:117] "RemoveContainer" containerID="ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f" Dec 09 15:24:28 crc kubenswrapper[4735]: E1209 15:24:28.857551 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f\": container with ID starting with ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f not found: ID does not exist" containerID="ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.857577 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f"} err="failed to get container status \"ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f\": rpc error: code = NotFound desc = could not find container \"ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f\": container with ID starting with ccdce85a84e38383c689f95494e012ff498eaae66da0dc7209ee1dd821e2831f not found: ID does not exist" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.888501 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.888574 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/784faae9-1603-40f3-acf7-c828c03774fb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:24:28 crc kubenswrapper[4735]: I1209 15:24:28.888596 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmnfn\" (UniqueName: \"kubernetes.io/projected/784faae9-1603-40f3-acf7-c828c03774fb-kube-api-access-pmnfn\") on node \"crc\" DevicePath \"\"" Dec 09 15:24:29 crc kubenswrapper[4735]: I1209 15:24:29.126206 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjkln"] Dec 09 15:24:29 crc kubenswrapper[4735]: I1209 15:24:29.129752 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qjkln"] Dec 09 15:24:29 crc kubenswrapper[4735]: I1209 15:24:29.420626 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784faae9-1603-40f3-acf7-c828c03774fb" path="/var/lib/kubelet/pods/784faae9-1603-40f3-acf7-c828c03774fb/volumes" Dec 09 15:24:35 crc kubenswrapper[4735]: I1209 15:24:35.414174 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:24:35 crc kubenswrapper[4735]: E1209 15:24:35.414850 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:24:42 crc kubenswrapper[4735]: E1209 15:24:42.415263 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:24:49 crc kubenswrapper[4735]: I1209 15:24:49.414159 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:24:49 crc kubenswrapper[4735]: E1209 15:24:49.414742 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:24:57 crc kubenswrapper[4735]: E1209 15:24:57.415944 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:25:03 crc kubenswrapper[4735]: I1209 15:25:03.413709 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:25:03 crc kubenswrapper[4735]: E1209 15:25:03.414293 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:25:15 crc kubenswrapper[4735]: I1209 15:25:15.413737 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:25:15 crc kubenswrapper[4735]: E1209 15:25:15.414268 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:25:30 crc kubenswrapper[4735]: I1209 15:25:30.414127 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:25:30 crc kubenswrapper[4735]: E1209 15:25:30.414596 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:25:43 crc kubenswrapper[4735]: I1209 15:25:43.414814 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:25:43 crc kubenswrapper[4735]: E1209 15:25:43.415435 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:25:55 crc kubenswrapper[4735]: I1209 15:25:55.413928 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:25:55 crc kubenswrapper[4735]: E1209 15:25:55.414756 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:26:07 crc kubenswrapper[4735]: I1209 15:26:07.413590 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:26:07 crc kubenswrapper[4735]: E1209 15:26:07.414114 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:26:22 crc kubenswrapper[4735]: I1209 15:26:22.413906 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:26:22 crc kubenswrapper[4735]: E1209 15:26:22.414405 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:26:36 crc kubenswrapper[4735]: I1209 15:26:36.414190 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:26:36 crc kubenswrapper[4735]: E1209 15:26:36.414844 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" 
podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:26:49 crc kubenswrapper[4735]: I1209 15:26:49.413945 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:26:49 crc kubenswrapper[4735]: E1209 15:26:49.414775 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:27:03 crc kubenswrapper[4735]: I1209 15:27:03.414456 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:27:03 crc kubenswrapper[4735]: E1209 15:27:03.415242 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:27:12 crc kubenswrapper[4735]: E1209 15:27:12.419858 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:27:12 crc kubenswrapper[4735]: E1209 15:27:12.420230 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:27:12 crc kubenswrapper[4735]: E1209 15:27:12.420384 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxcxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jzq8f_openstack-operators(9bfb1b1a-87c7-4fa5-ad02-935f53dbc081): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:27:12 crc kubenswrapper[4735]: E1209 15:27:12.421600 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:27:17 crc kubenswrapper[4735]: I1209 15:27:17.415810 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:27:17 crc kubenswrapper[4735]: E1209 15:27:17.416928 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:27:26 crc kubenswrapper[4735]: E1209 15:27:26.415574 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:27:28 crc kubenswrapper[4735]: I1209 15:27:28.413433 4735 
scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:27:28 crc kubenswrapper[4735]: E1209 15:27:28.413936 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:27:39 crc kubenswrapper[4735]: E1209 15:27:39.415373 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:27:41 crc kubenswrapper[4735]: I1209 15:27:41.416805 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:27:41 crc kubenswrapper[4735]: E1209 15:27:41.417203 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:27:53 crc kubenswrapper[4735]: E1209 15:27:53.417055 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:27:56 crc kubenswrapper[4735]: I1209 15:27:56.413466 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:27:56 crc kubenswrapper[4735]: E1209 15:27:56.413931 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" Dec 09 15:28:05 crc kubenswrapper[4735]: E1209 15:28:05.416892 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:28:11 crc kubenswrapper[4735]: I1209 15:28:11.416088 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:28:11 crc kubenswrapper[4735]: I1209 15:28:11.945956 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"51d57515a98a21107784fffd63ae292e17e7ff456c3c70324773d956bec8552f"} Dec 09 15:28:19 crc kubenswrapper[4735]: E1209 15:28:19.415968 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:28:31 crc kubenswrapper[4735]: E1209 15:28:31.417422 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:28:44 crc kubenswrapper[4735]: E1209 15:28:44.415623 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:28:56 crc kubenswrapper[4735]: E1209 15:28:56.415818 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:29:09 crc kubenswrapper[4735]: E1209 15:29:09.415363 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:29:22 crc kubenswrapper[4735]: E1209 15:29:22.415364 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:29:35 crc kubenswrapper[4735]: E1209 15:29:35.415620 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:29:50 crc kubenswrapper[4735]: E1209 15:29:50.414769 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" 
with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.123887 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn"] Dec 09 15:30:00 crc kubenswrapper[4735]: E1209 15:30:00.124277 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="extract-content" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124290 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="extract-content" Dec 09 15:30:00 crc kubenswrapper[4735]: E1209 15:30:00.124301 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="extract-utilities" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124307 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="extract-utilities" Dec 09 15:30:00 crc kubenswrapper[4735]: E1209 15:30:00.124316 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="extract-utilities" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124321 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="extract-utilities" Dec 09 15:30:00 crc kubenswrapper[4735]: E1209 15:30:00.124331 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124338 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4735]: E1209 15:30:00.124346 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="extract-content" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124351 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="extract-content" Dec 09 15:30:00 crc kubenswrapper[4735]: E1209 15:30:00.124365 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124370 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124472 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="784faae9-1603-40f3-acf7-c828c03774fb" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124488 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41b8ce2-282a-4ef7-9142-577a0320b7c1" containerName="registry-server" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.124909 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.126605 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.126815 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.129747 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn"] Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.199452 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj66b\" (UniqueName: \"kubernetes.io/projected/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-kube-api-access-nj66b\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.199500 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-config-volume\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.199527 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-secret-volume\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.300458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj66b\" (UniqueName: \"kubernetes.io/projected/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-kube-api-access-nj66b\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.300542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-config-volume\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.300564 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-secret-volume\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.301256 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-config-volume\") pod 
\"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.305104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-secret-volume\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.312640 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj66b\" (UniqueName: \"kubernetes.io/projected/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-kube-api-access-nj66b\") pod \"collect-profiles-29421570-95kkn\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.438791 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:00 crc kubenswrapper[4735]: I1209 15:30:00.780664 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn"] Dec 09 15:30:00 crc kubenswrapper[4735]: W1209 15:30:00.783654 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a0c06c_bab8_4f4c_a19b_63a6dc41bc55.slice/crio-b0d517e8f8a8aa2bcdb84be16ac505b4b7b4eab384b2f1b288c05bbe19489cbd WatchSource:0}: Error finding container b0d517e8f8a8aa2bcdb84be16ac505b4b7b4eab384b2f1b288c05bbe19489cbd: Status 404 returned error can't find the container with id b0d517e8f8a8aa2bcdb84be16ac505b4b7b4eab384b2f1b288c05bbe19489cbd Dec 09 15:30:01 crc kubenswrapper[4735]: I1209 15:30:01.487256 4735 generic.go:334] "Generic (PLEG): container finished" podID="e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" containerID="cdc734bd7b9e5f445451e84c9cb24aa137c9a78f997ca315c493b98be5deccd4" exitCode=0 Dec 09 15:30:01 crc kubenswrapper[4735]: I1209 15:30:01.487350 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" event={"ID":"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55","Type":"ContainerDied","Data":"cdc734bd7b9e5f445451e84c9cb24aa137c9a78f997ca315c493b98be5deccd4"} Dec 09 15:30:01 crc kubenswrapper[4735]: I1209 15:30:01.487428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" event={"ID":"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55","Type":"ContainerStarted","Data":"b0d517e8f8a8aa2bcdb84be16ac505b4b7b4eab384b2f1b288c05bbe19489cbd"} Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.694343 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.725494 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-config-volume\") pod \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.725571 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-secret-volume\") pod \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.725626 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj66b\" (UniqueName: \"kubernetes.io/projected/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-kube-api-access-nj66b\") pod \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\" (UID: \"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55\") " Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.726079 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-config-volume" (OuterVolumeSpecName: "config-volume") pod "e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" (UID: "e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.729804 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-kube-api-access-nj66b" (OuterVolumeSpecName: "kube-api-access-nj66b") pod "e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" (UID: "e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55"). InnerVolumeSpecName "kube-api-access-nj66b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.729896 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" (UID: "e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.827499 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.827751 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj66b\" (UniqueName: \"kubernetes.io/projected/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-kube-api-access-nj66b\") on node \"crc\" DevicePath \"\"" Dec 09 15:30:02 crc kubenswrapper[4735]: I1209 15:30:02.827783 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55-config-volume\") on node \"crc\" DevicePath \"\"" Dec 09 15:30:03 crc kubenswrapper[4735]: I1209 15:30:03.497070 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" event={"ID":"e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55","Type":"ContainerDied","Data":"b0d517e8f8a8aa2bcdb84be16ac505b4b7b4eab384b2f1b288c05bbe19489cbd"} Dec 09 15:30:03 crc kubenswrapper[4735]: I1209 15:30:03.497102 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d517e8f8a8aa2bcdb84be16ac505b4b7b4eab384b2f1b288c05bbe19489cbd" Dec 09 15:30:03 crc kubenswrapper[4735]: I1209 15:30:03.497106 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29421570-95kkn" Dec 09 15:30:05 crc kubenswrapper[4735]: I1209 15:30:05.415868 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.595672 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xlskw/must-gather-262dd"] Dec 09 15:30:06 crc kubenswrapper[4735]: E1209 15:30:06.595917 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" containerName="collect-profiles" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.595929 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" containerName="collect-profiles" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.596042 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9a0c06c-bab8-4f4c-a19b-63a6dc41bc55" containerName="collect-profiles" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.596649 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.599072 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xlskw"/"default-dockercfg-s78vb" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.599113 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xlskw"/"kube-root-ca.crt" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.599286 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xlskw"/"openshift-service-ca.crt" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.607363 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlskw/must-gather-262dd"] Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.667502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6czs\" (UniqueName: \"kubernetes.io/projected/dab0fe89-76a6-438b-ab5d-edb1f667e091-kube-api-access-w6czs\") pod \"must-gather-262dd\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.667594 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe89-76a6-438b-ab5d-edb1f667e091-must-gather-output\") pod \"must-gather-262dd\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.768189 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe89-76a6-438b-ab5d-edb1f667e091-must-gather-output\") pod \"must-gather-262dd\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.768273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6czs\" (UniqueName: \"kubernetes.io/projected/dab0fe89-76a6-438b-ab5d-edb1f667e091-kube-api-access-w6czs\") pod \"must-gather-262dd\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.768824 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe89-76a6-438b-ab5d-edb1f667e091-must-gather-output\") pod \"must-gather-262dd\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.799952 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6czs\" (UniqueName: \"kubernetes.io/projected/dab0fe89-76a6-438b-ab5d-edb1f667e091-kube-api-access-w6czs\") pod \"must-gather-262dd\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:06 crc kubenswrapper[4735]: I1209 15:30:06.909448 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:30:07 crc kubenswrapper[4735]: I1209 15:30:07.261401 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xlskw/must-gather-262dd"] Dec 09 15:30:07 crc kubenswrapper[4735]: I1209 15:30:07.516980 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlskw/must-gather-262dd" event={"ID":"dab0fe89-76a6-438b-ab5d-edb1f667e091","Type":"ContainerStarted","Data":"f3eced11bf1682b4899d2bd44b4e214002f3b9c988acc0069d58e2811d5ec25b"} Dec 09 15:30:12 crc kubenswrapper[4735]: I1209 15:30:12.557493 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlskw/must-gather-262dd" event={"ID":"dab0fe89-76a6-438b-ab5d-edb1f667e091","Type":"ContainerStarted","Data":"d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5"} Dec 09 15:30:12 crc kubenswrapper[4735]: I1209 15:30:12.558582 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xlskw/must-gather-262dd" event={"ID":"dab0fe89-76a6-438b-ab5d-edb1f667e091","Type":"ContainerStarted","Data":"81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4"} Dec 09 15:30:12 crc kubenswrapper[4735]: I1209 15:30:12.570840 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xlskw/must-gather-262dd" podStartSLOduration=1.777272103 podStartE2EDuration="6.57083009s" podCreationTimestamp="2025-12-09 15:30:06 +0000 UTC" firstStartedPulling="2025-12-09 15:30:07.267064647 +0000 UTC m=+1886.191903275" lastFinishedPulling="2025-12-09 15:30:12.060622634 +0000 UTC m=+1890.985461262" observedRunningTime="2025-12-09 15:30:12.569579188 +0000 UTC m=+1891.494417817" watchObservedRunningTime="2025-12-09 15:30:12.57083009 +0000 UTC m=+1891.495668717" Dec 09 15:30:34 crc kubenswrapper[4735]: I1209 15:30:34.336445 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:30:34 crc kubenswrapper[4735]: I1209 15:30:34.337238 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:30:42 crc kubenswrapper[4735]: I1209 15:30:42.011600 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w8m5z_381225e4-030b-401b-a1c6-8926f3a806b7/control-plane-machine-set-operator/0.log" Dec 09 15:30:42 crc kubenswrapper[4735]: I1209 15:30:42.089904 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r4hp5_3a186d2c-8f55-4033-9154-b4ff929c9a98/kube-rbac-proxy/0.log" Dec 09 15:30:42 crc kubenswrapper[4735]: I1209 15:30:42.141224 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r4hp5_3a186d2c-8f55-4033-9154-b4ff929c9a98/machine-api-operator/0.log" Dec 09 15:30:52 crc kubenswrapper[4735]: I1209 15:30:52.239877 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-j9rnk_835eedbd-5d0a-4837-997e-53d608904958/cert-manager-controller/0.log" Dec 09 15:30:52 crc kubenswrapper[4735]: I1209 15:30:52.356318 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mnkbr_633cc636-711f-4c6f-9b1e-a8ed2b60f487/cert-manager-cainjector/0.log" Dec 09 15:30:52 crc kubenswrapper[4735]: I1209 15:30:52.380679 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sxb9d_fb15867f-8803-4aa2-b592-5d6267f53c4f/cert-manager-webhook/0.log" Dec 09 15:31:01 crc kubenswrapper[4735]: I1209 15:31:01.713618 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-bhbpx_8b8e4e56-8472-4c79-8885-7e4b5cdb182f/nmstate-console-plugin/0.log" Dec 09 15:31:01 crc kubenswrapper[4735]: I1209 15:31:01.833713 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gkqpg_710b5a96-ea7b-4fef-9a9a-e96f8d055709/nmstate-handler/0.log" Dec 09 15:31:01 crc kubenswrapper[4735]: I1209 15:31:01.850964 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bc572_ea111a0c-0995-4059-bd26-5d8b93e9f40c/nmstate-metrics/0.log" Dec 09 15:31:01 crc kubenswrapper[4735]: I1209 15:31:01.903345 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-bc572_ea111a0c-0995-4059-bd26-5d8b93e9f40c/kube-rbac-proxy/0.log" Dec 09 15:31:02 crc kubenswrapper[4735]: I1209 15:31:02.018629 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-5wslg_3f70f38a-2464-434f-b617-e93a1ad19c15/nmstate-operator/0.log" Dec 09 15:31:02 crc kubenswrapper[4735]: I1209 15:31:02.081765 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-jrtqv_eb588389-ddd7-4e1a-b514-16bb44dfb935/nmstate-webhook/0.log" Dec 09 15:31:04 crc kubenswrapper[4735]: I1209 15:31:04.335721 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:31:04 crc kubenswrapper[4735]: I1209 15:31:04.336066 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:31:11 crc kubenswrapper[4735]: I1209 15:31:11.835916 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-ff768b6c6-qngzh_393ef666-9d18-4435-bbc9-76acb5636ce7/manager/0.log" Dec 09 15:31:11 crc kubenswrapper[4735]: I1209 15:31:11.849685 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-ff768b6c6-qngzh_393ef666-9d18-4435-bbc9-76acb5636ce7/kube-rbac-proxy/0.log" Dec 09 15:31:21 crc kubenswrapper[4735]: I1209 15:31:21.831410 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-969jl"] Dec 09 15:31:21 crc kubenswrapper[4735]: I1209 15:31:21.832890 
4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:21 crc kubenswrapper[4735]: I1209 15:31:21.841056 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-969jl"] Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.002143 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k7pz\" (UniqueName: \"kubernetes.io/projected/5356ea20-dbb0-447f-b51e-679a5031dc51-kube-api-access-8k7pz\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.002264 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-utilities\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.002367 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-catalog-content\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.103733 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-catalog-content\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.103869 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k7pz\" (UniqueName: \"kubernetes.io/projected/5356ea20-dbb0-447f-b51e-679a5031dc51-kube-api-access-8k7pz\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.103917 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-utilities\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.104202 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-catalog-content\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.104269 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-utilities\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 
15:31:22.120543 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k7pz\" (UniqueName: \"kubernetes.io/projected/5356ea20-dbb0-447f-b51e-679a5031dc51-kube-api-access-8k7pz\") pod \"certified-operators-969jl\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.147259 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.552749 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-969jl"] Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.973543 4735 generic.go:334] "Generic (PLEG): container finished" podID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerID="b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb" exitCode=0 Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.973633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerDied","Data":"b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb"} Dec 09 15:31:22 crc kubenswrapper[4735]: I1209 15:31:22.973843 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerStarted","Data":"070d6ace3d7ba295379087674f20ef0afaa2fc798e9298de06d9a177c90d1fb8"} Dec 09 15:31:23 crc kubenswrapper[4735]: I1209 15:31:23.705588 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-gvl4x_1fe25fc9-0b02-46c3-8654-1db38cfefabc/cluster-logging-operator/0.log" Dec 09 15:31:23 crc kubenswrapper[4735]: I1209 15:31:23.833600 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-zw7mb_d504cda5-e7a2-45e3-b775-73b06f492def/collector/0.log" Dec 09 15:31:23 crc kubenswrapper[4735]: I1209 15:31:23.870347 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_a771f4c8-0d04-4c79-be0d-5b45b4b5a037/loki-compactor/0.log" Dec 09 15:31:23 crc kubenswrapper[4735]: I1209 15:31:23.979803 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerStarted","Data":"2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4"} Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.023263 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-rqpx8_80688e3b-d1ff-4a5b-b7e3-f516ad7a4c41/loki-distributor/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.064256 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-55c764d7cd-dftlk_8c1d40e2-618c-45c5-bee9-c620c977a7a5/opa/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.064853 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-55c764d7cd-dftlk_8c1d40e2-618c-45c5-bee9-c620c977a7a5/gateway/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.306311 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-55c764d7cd-lvwlq_7d3963f2-7997-4cd6-8d9f-accd69aac83b/gateway/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.324826 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-55c764d7cd-lvwlq_7d3963f2-7997-4cd6-8d9f-accd69aac83b/opa/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.440150 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_dacf0d91-16bc-4578-b424-f4524b47d537/loki-index-gateway/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.475654 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_c09b376f-0220-475e-8fed-5219c7e7a147/loki-ingester/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.609006 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-7flxb_a9909b2d-93ab-4b43-b415-7d52ac5031e4/loki-query-frontend/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.610281 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-g7jxl_b0d84c01-0813-4862-add0-1086e3ca895f/loki-querier/0.log" Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.987176 4735 generic.go:334] "Generic (PLEG): container finished" podID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerID="2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4" exitCode=0 Dec 09 15:31:24 crc kubenswrapper[4735]: I1209 15:31:24.987214 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerDied","Data":"2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4"} Dec 09 15:31:25 crc kubenswrapper[4735]: I1209 15:31:25.994258 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerStarted","Data":"e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685"} Dec 09 15:31:26 crc kubenswrapper[4735]: I1209 15:31:26.012250 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-969jl" podStartSLOduration=2.5041265839999998 podStartE2EDuration="5.012229909s" podCreationTimestamp="2025-12-09 15:31:21 +0000 UTC" firstStartedPulling="2025-12-09 15:31:22.975025234 +0000 UTC m=+1961.899863862" lastFinishedPulling="2025-12-09 15:31:25.483128559 +0000 UTC m=+1964.407967187" observedRunningTime="2025-12-09 15:31:26.011018873 +0000 UTC m=+1964.935857501" watchObservedRunningTime="2025-12-09 15:31:26.012229909 +0000 UTC m=+1964.937068537" Dec 09 15:31:32 crc kubenswrapper[4735]: I1209 15:31:32.147947 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:32 crc kubenswrapper[4735]: I1209 15:31:32.148329 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:32 crc kubenswrapper[4735]: I1209 15:31:32.180875 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:33 crc kubenswrapper[4735]: I1209 15:31:33.064402 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:33 crc kubenswrapper[4735]: I1209 15:31:33.102802 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-969jl"] Dec 09 15:31:34 crc kubenswrapper[4735]: I1209 15:31:34.336026 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:31:34 crc kubenswrapper[4735]: I1209 15:31:34.336335 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:31:34 crc kubenswrapper[4735]: I1209 15:31:34.336384 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" Dec 09 15:31:34 crc kubenswrapper[4735]: I1209 15:31:34.337016 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51d57515a98a21107784fffd63ae292e17e7ff456c3c70324773d956bec8552f"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 09 15:31:34 crc kubenswrapper[4735]: I1209 15:31:34.337069 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://51d57515a98a21107784fffd63ae292e17e7ff456c3c70324773d956bec8552f" gracePeriod=600 Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.046343 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="51d57515a98a21107784fffd63ae292e17e7ff456c3c70324773d956bec8552f" exitCode=0 Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.046417 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"51d57515a98a21107784fffd63ae292e17e7ff456c3c70324773d956bec8552f"} Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.046678 4735 scope.go:117] "RemoveContainer" containerID="27ec292bec482089dad4953feee2067106e2818b7be8daa3bb9c0037f58eb17a" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.046837 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-969jl" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="registry-server" containerID="cri-o://e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685" gracePeriod=2 Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.446106 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.636214 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-utilities\") pod \"5356ea20-dbb0-447f-b51e-679a5031dc51\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.636403 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k7pz\" (UniqueName: \"kubernetes.io/projected/5356ea20-dbb0-447f-b51e-679a5031dc51-kube-api-access-8k7pz\") pod \"5356ea20-dbb0-447f-b51e-679a5031dc51\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.636454 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-catalog-content\") pod \"5356ea20-dbb0-447f-b51e-679a5031dc51\" (UID: \"5356ea20-dbb0-447f-b51e-679a5031dc51\") " Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.637080 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-utilities" (OuterVolumeSpecName: "utilities") pod "5356ea20-dbb0-447f-b51e-679a5031dc51" (UID: "5356ea20-dbb0-447f-b51e-679a5031dc51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.643668 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5356ea20-dbb0-447f-b51e-679a5031dc51-kube-api-access-8k7pz" (OuterVolumeSpecName: "kube-api-access-8k7pz") pod "5356ea20-dbb0-447f-b51e-679a5031dc51" (UID: "5356ea20-dbb0-447f-b51e-679a5031dc51"). InnerVolumeSpecName "kube-api-access-8k7pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.674579 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5356ea20-dbb0-447f-b51e-679a5031dc51" (UID: "5356ea20-dbb0-447f-b51e-679a5031dc51"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.737848 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.737881 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k7pz\" (UniqueName: \"kubernetes.io/projected/5356ea20-dbb0-447f-b51e-679a5031dc51-kube-api-access-8k7pz\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.737893 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5356ea20-dbb0-447f-b51e-679a5031dc51-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.795072 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-s87j2_b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c/controller/0.log" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.812912 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-s87j2_b1d2fdb7-18ab-4723-aa7f-1f1b4e2d527c/kube-rbac-proxy/0.log" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.898091 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-v8jpp_5a837ca7-0878-4803-b74a-ffdde5633d0b/frr-k8s-webhook-server/0.log" Dec 09 15:31:35 crc kubenswrapper[4735]: I1209 15:31:35.958088 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-frr-files/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.054006 4735 generic.go:334] "Generic (PLEG): container finished" podID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerID="e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685" exitCode=0 Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.054089 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-969jl" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.054101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerDied","Data":"e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685"} Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.054194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-969jl" event={"ID":"5356ea20-dbb0-447f-b51e-679a5031dc51","Type":"ContainerDied","Data":"070d6ace3d7ba295379087674f20ef0afaa2fc798e9298de06d9a177c90d1fb8"} Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.054233 4735 scope.go:117] "RemoveContainer" containerID="e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.056259 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerStarted","Data":"486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"} Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.075221 4735 scope.go:117] "RemoveContainer" containerID="2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.084491 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-969jl"] Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.088266 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-969jl"] Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.105367 4735 scope.go:117] "RemoveContainer" containerID="b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.120770 4735 scope.go:117] "RemoveContainer" containerID="e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685" Dec 09 15:31:36 crc kubenswrapper[4735]: E1209 15:31:36.121610 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685\": container with ID starting with e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685 not found: ID does not exist" containerID="e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.121728 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685"} err="failed to get container status \"e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685\": rpc error: code = NotFound desc = could not find container \"e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685\": container with ID starting with e70d3b9050cb98ec2b0544e90060e7732fba34925498a8ad0f0a299cc8e98685 not found: ID does not exist" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.121820 4735 scope.go:117] "RemoveContainer" containerID="2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4" Dec 09 15:31:36 crc kubenswrapper[4735]: E1209 15:31:36.122398 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4\": container with ID starting with 2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4 not found: ID does not exist" containerID="2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.122437 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4"} err="failed to get container status \"2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4\": rpc error: code = NotFound desc = could not find container \"2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4\": container with ID starting with 2f666b9ef7412b022745a1b61d37ec8044b55e7f1bf22da83f2e0d89a73cefa4 not found: ID does not exist" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.122463 4735 scope.go:117] "RemoveContainer" containerID="b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb" Dec 09 15:31:36 crc kubenswrapper[4735]: E1209 15:31:36.122844 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb\": container with ID starting with b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb not found: ID does not exist" containerID="b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.122927 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb"} err="failed to get container status \"b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb\": rpc error: code = NotFound desc = could not find container \"b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb\": container with ID starting with b4b0a4f8837292b99202e3b4a4176228da15ee0707bc3d4e16b4f6bc02869ffb not found: ID does not exist" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.129940 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-reloader/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.139954 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-frr-files/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.157888 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-reloader/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.169045 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-metrics/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.296271 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-reloader/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.304869 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-frr-files/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.319175 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-metrics/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.342172 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-metrics/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.473859 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-frr-files/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.479749 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-reloader/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.507580 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/cp-metrics/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.510873 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/controller/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.626821 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/frr-metrics/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.665662 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/frr/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.759339 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/kube-rbac-proxy-frr/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.763169 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/kube-rbac-proxy/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.900387 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x68zj_2963bb65-bcd6-473e-a13a-fa1413c7564e/reloader/0.log" Dec 09 15:31:36 crc kubenswrapper[4735]: I1209 15:31:36.931284 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-577d56cdf4-lrqmp_4e80d858-b2b3-4009-b421-f9b227ee3873/manager/0.log" Dec 09 15:31:37 crc kubenswrapper[4735]: I1209 15:31:37.070131 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86bc98687f-dcjrx_ca532396-6f51-428d-b4ff-ac3bf1920207/webhook-server/0.log" Dec 09 15:31:37 crc kubenswrapper[4735]: I1209 15:31:37.144094 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8tx7p_f8fafb99-3897-41e7-a1b7-aca9fce8ceae/kube-rbac-proxy/0.log" Dec 09 15:31:37 crc kubenswrapper[4735]: I1209 15:31:37.273120 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8tx7p_f8fafb99-3897-41e7-a1b7-aca9fce8ceae/speaker/0.log" Dec 09 15:31:37 crc kubenswrapper[4735]: I1209 15:31:37.421489 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" path="/var/lib/kubelet/pods/5356ea20-dbb0-447f-b51e-679a5031dc51/volumes" Dec 09 15:31:47 crc kubenswrapper[4735]: I1209 15:31:47.666546 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/util/0.log" Dec 09 15:31:47 crc kubenswrapper[4735]: I1209 15:31:47.796387 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/pull/0.log" Dec 09 15:31:47 crc kubenswrapper[4735]: I1209 15:31:47.804463 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/util/0.log" Dec 09 15:31:47 crc kubenswrapper[4735]: I1209 15:31:47.873366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/pull/0.log" Dec 09 15:31:47 crc kubenswrapper[4735]: I1209 15:31:47.970522 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/pull/0.log" Dec 09 15:31:47 crc kubenswrapper[4735]: I1209 15:31:47.979943 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.019425 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8q9cqp_abd436d6-c0b4-4779-a531-98449b2755da/extract/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.115864 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.279059 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/pull/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.307953 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/pull/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.309129 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.381811 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.424419 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/pull/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.450845 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fnmsr8_991fcb59-3ea0-4f22-bfd9-cf5ad15df1a2/extract/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.567232 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.689114 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/pull/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.691430 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/pull/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.737107 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.879077 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/util/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.907939 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/extract/0.log" Dec 09 15:31:48 crc kubenswrapper[4735]: I1209 15:31:48.913416 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210qwrpr_3a8584cb-b1fa-4d42-b2f8-7b3435a67747/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.034341 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/util/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.185366 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.196607 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/util/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.204424 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.331104 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/util/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.341043 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/extract/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.396646 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463f9gg59_714ef1b2-0a49-4258-aee7-e551f87c0ef4/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.514309 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/util/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.627626 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/util/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.679278 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.698588 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.817168 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/pull/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.820146 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/extract/0.log" Dec 09 15:31:49 crc kubenswrapper[4735]: I1209 15:31:49.841080 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bpb6m_bc34c9f9-fd22-4fc5-ab61-7b400ec0a6ac/util/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.058836 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/extract-utilities/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.208238 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/extract-utilities/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.220299 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/extract-content/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.224879 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/extract-content/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.377933 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/extract-utilities/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.378158 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/extract-content/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.561188 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/extract-utilities/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.712688 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sn6kd_9fd06704-c0be-460b-ad6d-7d976889607e/registry-server/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.737144 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/extract-content/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.739458 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/extract-utilities/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.771751 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/extract-content/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.897866 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/extract-utilities/0.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.951020 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/2.log" Dec 09 15:31:50 crc kubenswrapper[4735]: I1209 15:31:50.953927 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.090389 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9d4nc_156f78f9-e75d-4fe3-92b0-e2c29af0728c/marketplace-operator/1.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.110863 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/extract-utilities/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.137607 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d2c69_656d05f6-6e95-4f8a-a360-ae278fd02c3b/registry-server/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.275554 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/extract-utilities/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.294841 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.310927 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.446256 4735 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.479924 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/extract-utilities/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.485269 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/extract-utilities/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.590469 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hh9r9_b45c7f1f-c09b-4d8a-9bfb-b8dec29f1e60/registry-server/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.635430 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/extract-utilities/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.668170 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.681395 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.813317 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/extract-content/0.log" Dec 09 15:31:51 crc kubenswrapper[4735]: I1209 15:31:51.818819 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/extract-utilities/0.log" Dec 09 15:31:52 crc kubenswrapper[4735]: I1209 15:31:52.117784 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vzx8d_33564b45-b47d-4cb7-8ff6-fa0226782e59/registry-server/0.log" Dec 09 15:32:02 crc kubenswrapper[4735]: I1209 15:32:02.014393 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-nqm47_bfe62381-3825-4204-bbf4-8970225de2c4/prometheus-operator/0.log" Dec 09 15:32:02 crc kubenswrapper[4735]: I1209 15:32:02.159019 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68cc4f6974-b8zp8_205460a8-cdae-43e1-8b9c-123f7f4f8c29/prometheus-operator-admission-webhook/0.log" Dec 09 15:32:02 crc kubenswrapper[4735]: I1209 15:32:02.205608 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68cc4f6974-rp6tp_0007731d-b209-44eb-b95a-5b3b95a02ac2/prometheus-operator-admission-webhook/0.log" Dec 09 15:32:02 crc kubenswrapper[4735]: I1209 15:32:02.455795 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-h5bn2_63cae057-68b9-4d57-a64e-fd9314da6cfd/operator/0.log" Dec 09 15:32:02 crc kubenswrapper[4735]: I1209 15:32:02.457665 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-w2xlx_2c5b3de4-5006-4a83-b672-1fa5f2bf2cec/perses-operator/0.log" Dec 09 
15:32:05 crc kubenswrapper[4735]: E1209 15:32:05.420768 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:32:05 crc kubenswrapper[4735]: E1209 15:32:05.421224 4735 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" image="38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05" Dec 09 15:32:05 crc kubenswrapper[4735]: E1209 15:32:05.421355 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxcxg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-operator-index-jzq8f_openstack-operators(9bfb1b1a-87c7-4fa5-ad02-935f53dbc081): ErrImagePull: rpc error: code = DeadlineExceeded desc = initializing source 
docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \"http://38.102.83.5:5001/v2/\": dial tcp 38.102.83.5:5001: i/o timeout" logger="UnhandledError" Dec 09 15:32:05 crc kubenswrapper[4735]: E1209 15:32:05.422520 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = DeadlineExceeded desc = initializing source docker://38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05: pinging container registry 38.102.83.5:5001: Get \\\"http://38.102.83.5:5001/v2/\\\": dial tcp 38.102.83.5:5001: i/o timeout\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:32:12 crc kubenswrapper[4735]: I1209 15:32:12.011037 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-ff768b6c6-qngzh_393ef666-9d18-4435-bbc9-76acb5636ce7/kube-rbac-proxy/0.log" Dec 09 15:32:12 crc kubenswrapper[4735]: I1209 15:32:12.024350 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-ff768b6c6-qngzh_393ef666-9d18-4435-bbc9-76acb5636ce7/manager/0.log" Dec 09 15:32:17 crc kubenswrapper[4735]: E1209 15:32:17.420278 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:32:28 crc kubenswrapper[4735]: E1209 15:32:28.416488 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:32:39 crc kubenswrapper[4735]: E1209 15:32:39.415651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:32:50 crc kubenswrapper[4735]: E1209 15:32:50.415783 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:32:55 crc kubenswrapper[4735]: I1209 15:32:55.502465 4735 generic.go:334] "Generic (PLEG): container finished" podID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerID="81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4" exitCode=0 Dec 09 15:32:55 crc kubenswrapper[4735]: I1209 15:32:55.502548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-xlskw/must-gather-262dd" event={"ID":"dab0fe89-76a6-438b-ab5d-edb1f667e091","Type":"ContainerDied","Data":"81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4"} Dec 09 15:32:55 crc kubenswrapper[4735]: I1209 15:32:55.503313 4735 scope.go:117] "RemoveContainer" containerID="81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4" Dec 09 15:32:55 crc kubenswrapper[4735]: I1209 15:32:55.880099 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xlskw_must-gather-262dd_dab0fe89-76a6-438b-ab5d-edb1f667e091/gather/0.log" Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.532196 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xlskw/must-gather-262dd"] Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.532863 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xlskw/must-gather-262dd" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="copy" containerID="cri-o://d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5" gracePeriod=2 Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.538414 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xlskw/must-gather-262dd"] Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.836749 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xlskw_must-gather-262dd_dab0fe89-76a6-438b-ab5d-edb1f667e091/copy/0.log" Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.837290 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.940842 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe89-76a6-438b-ab5d-edb1f667e091-must-gather-output\") pod \"dab0fe89-76a6-438b-ab5d-edb1f667e091\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.940939 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6czs\" (UniqueName: \"kubernetes.io/projected/dab0fe89-76a6-438b-ab5d-edb1f667e091-kube-api-access-w6czs\") pod \"dab0fe89-76a6-438b-ab5d-edb1f667e091\" (UID: \"dab0fe89-76a6-438b-ab5d-edb1f667e091\") " Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.945308 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab0fe89-76a6-438b-ab5d-edb1f667e091-kube-api-access-w6czs" (OuterVolumeSpecName: "kube-api-access-w6czs") pod "dab0fe89-76a6-438b-ab5d-edb1f667e091" (UID: "dab0fe89-76a6-438b-ab5d-edb1f667e091"). InnerVolumeSpecName "kube-api-access-w6czs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:33:02 crc kubenswrapper[4735]: I1209 15:33:02.989848 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab0fe89-76a6-438b-ab5d-edb1f667e091-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dab0fe89-76a6-438b-ab5d-edb1f667e091" (UID: "dab0fe89-76a6-438b-ab5d-edb1f667e091"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.042771 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dab0fe89-76a6-438b-ab5d-edb1f667e091-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.042958 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6czs\" (UniqueName: \"kubernetes.io/projected/dab0fe89-76a6-438b-ab5d-edb1f667e091-kube-api-access-w6czs\") on node \"crc\" DevicePath \"\"" Dec 09 15:33:03 crc kubenswrapper[4735]: E1209 15:33:03.414956 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.419898 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" path="/var/lib/kubelet/pods/dab0fe89-76a6-438b-ab5d-edb1f667e091/volumes" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.541985 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xlskw_must-gather-262dd_dab0fe89-76a6-438b-ab5d-edb1f667e091/copy/0.log" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.542318 4735 generic.go:334] "Generic (PLEG): container finished" podID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerID="d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5" exitCode=143 Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.542359 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xlskw/must-gather-262dd" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.542363 4735 scope.go:117] "RemoveContainer" containerID="d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.555173 4735 scope.go:117] "RemoveContainer" containerID="81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.578300 4735 scope.go:117] "RemoveContainer" containerID="d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5" Dec 09 15:33:03 crc kubenswrapper[4735]: E1209 15:33:03.578824 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5\": container with ID starting with d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5 not found: ID does not exist" containerID="d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.578861 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5"} err="failed to get container status \"d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5\": rpc error: code = NotFound desc = could not find container \"d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5\": container with ID starting with d04f551881abccd0527303a180d118fbbd29cd31d7833c0fbf00b1ab5a9f82f5 not found: ID does not exist" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.578885 4735 scope.go:117] "RemoveContainer" containerID="81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4" Dec 09 15:33:03 crc kubenswrapper[4735]: E1209 15:33:03.579174 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4\": container with ID starting with 81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4 not found: ID does not exist" containerID="81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4" Dec 09 15:33:03 crc kubenswrapper[4735]: I1209 15:33:03.579222 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4"} err="failed to get container status \"81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4\": rpc error: code = NotFound desc = could not find container \"81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4\": container with ID starting with 81395ec3caffb3962fd3c2519f3ac1f9a2605be559c20216bcc82080564260a4 not found: ID does not exist" Dec 09 15:33:16 crc kubenswrapper[4735]: E1209 15:33:16.415906 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:33:30 crc kubenswrapper[4735]: E1209 15:33:30.417182 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: 
\"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:33:41 crc kubenswrapper[4735]: E1209 15:33:41.422653 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:33:53 crc kubenswrapper[4735]: E1209 15:33:53.417306 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:34:04 crc kubenswrapper[4735]: I1209 15:34:04.335326 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 09 15:34:04 crc kubenswrapper[4735]: I1209 15:34:04.335801 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:34:06 crc kubenswrapper[4735]: E1209 15:34:06.415709 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:34:19 crc kubenswrapper[4735]: E1209 15:34:19.415320 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:34:33 crc kubenswrapper[4735]: E1209 15:34:33.416281 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.335803 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.335854 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.771952 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcxf"] Dec 09 15:34:34 crc kubenswrapper[4735]: E1209 15:34:34.772207 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="extract-utilities" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772219 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="extract-utilities" Dec 09 15:34:34 crc kubenswrapper[4735]: E1209 15:34:34.772233 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="gather" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772239 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="gather" Dec 09 15:34:34 crc kubenswrapper[4735]: E1209 15:34:34.772247 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="registry-server" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772252 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="registry-server" Dec 09 15:34:34 crc kubenswrapper[4735]: E1209 15:34:34.772277 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="copy" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772282 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="copy" Dec 09 15:34:34 crc kubenswrapper[4735]: E1209 15:34:34.772295 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="extract-content" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772300 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="extract-content" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772425 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5356ea20-dbb0-447f-b51e-679a5031dc51" containerName="registry-server" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772435 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="copy" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.772444 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab0fe89-76a6-438b-ab5d-edb1f667e091" containerName="gather" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.773337 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.782536 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcxf"] Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.883985 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-catalog-content\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.884078 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-utilities\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.884104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msz7f\" (UniqueName: \"kubernetes.io/projected/6607c144-e5d7-452f-9237-66e178e6e6a1-kube-api-access-msz7f\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.985886 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-utilities\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.985925 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msz7f\" (UniqueName: \"kubernetes.io/projected/6607c144-e5d7-452f-9237-66e178e6e6a1-kube-api-access-msz7f\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.985978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-catalog-content\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.986341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-utilities\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:34 crc kubenswrapper[4735]: I1209 15:34:34.986392 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-catalog-content\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:35 crc kubenswrapper[4735]: I1209 15:34:35.002393 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-msz7f\" (UniqueName: \"kubernetes.io/projected/6607c144-e5d7-452f-9237-66e178e6e6a1-kube-api-access-msz7f\") pod \"redhat-marketplace-4rcxf\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:35 crc kubenswrapper[4735]: I1209 15:34:35.087576 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:35 crc kubenswrapper[4735]: I1209 15:34:35.435018 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcxf"] Dec 09 15:34:35 crc kubenswrapper[4735]: W1209 15:34:35.440625 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6607c144_e5d7_452f_9237_66e178e6e6a1.slice/crio-13faa3346c360e24705f0dddbbad6027bce72de412d5e4a72bae0c329564d88f WatchSource:0}: Error finding container 13faa3346c360e24705f0dddbbad6027bce72de412d5e4a72bae0c329564d88f: Status 404 returned error can't find the container with id 13faa3346c360e24705f0dddbbad6027bce72de412d5e4a72bae0c329564d88f Dec 09 15:34:36 crc kubenswrapper[4735]: I1209 15:34:36.089904 4735 generic.go:334] "Generic (PLEG): container finished" podID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerID="d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956" exitCode=0 Dec 09 15:34:36 crc kubenswrapper[4735]: I1209 15:34:36.089986 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcxf" event={"ID":"6607c144-e5d7-452f-9237-66e178e6e6a1","Type":"ContainerDied","Data":"d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956"} Dec 09 15:34:36 crc kubenswrapper[4735]: I1209 15:34:36.090126 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcxf" event={"ID":"6607c144-e5d7-452f-9237-66e178e6e6a1","Type":"ContainerStarted","Data":"13faa3346c360e24705f0dddbbad6027bce72de412d5e4a72bae0c329564d88f"} Dec 09 15:34:37 crc kubenswrapper[4735]: I1209 15:34:37.097268 4735 generic.go:334] "Generic (PLEG): container finished" podID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerID="6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097" exitCode=0 Dec 09 15:34:37 crc kubenswrapper[4735]: I1209 15:34:37.097369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcxf" event={"ID":"6607c144-e5d7-452f-9237-66e178e6e6a1","Type":"ContainerDied","Data":"6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097"} Dec 09 15:34:38 crc kubenswrapper[4735]: I1209 15:34:38.104235 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcxf" event={"ID":"6607c144-e5d7-452f-9237-66e178e6e6a1","Type":"ContainerStarted","Data":"b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0"} Dec 09 15:34:38 crc kubenswrapper[4735]: I1209 15:34:38.120872 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rcxf" podStartSLOduration=2.671790059 podStartE2EDuration="4.120859322s" podCreationTimestamp="2025-12-09 15:34:34 +0000 UTC" firstStartedPulling="2025-12-09 15:34:36.091030998 +0000 UTC m=+2155.015869626" lastFinishedPulling="2025-12-09 15:34:37.540100262 +0000 UTC m=+2156.464938889" observedRunningTime="2025-12-09 15:34:38.117035293 +0000 UTC m=+2157.041873921" 
watchObservedRunningTime="2025-12-09 15:34:38.120859322 +0000 UTC m=+2157.045697950" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.159336 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cwh2x"] Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.161591 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.166427 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwh2x"] Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.252368 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-utilities\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.252448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7bhj\" (UniqueName: \"kubernetes.io/projected/2706759a-474d-4f2e-8b36-a8fee5bedc74-kube-api-access-k7bhj\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.252490 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-catalog-content\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.353261 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7bhj\" (UniqueName: \"kubernetes.io/projected/2706759a-474d-4f2e-8b36-a8fee5bedc74-kube-api-access-k7bhj\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.353542 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-catalog-content\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.353693 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-utilities\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.353959 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-catalog-content\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.353994 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-utilities\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.370697 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7bhj\" (UniqueName: \"kubernetes.io/projected/2706759a-474d-4f2e-8b36-a8fee5bedc74-kube-api-access-k7bhj\") pod \"community-operators-cwh2x\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") " pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.476864 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:41 crc kubenswrapper[4735]: I1209 15:34:41.842079 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cwh2x"] Dec 09 15:34:41 crc kubenswrapper[4735]: W1209 15:34:41.845218 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2706759a_474d_4f2e_8b36_a8fee5bedc74.slice/crio-900a922696cb5412c9a8e6b7bbc15e6e6341f14f873b96b2c2038e82a46c8ec3 WatchSource:0}: Error finding container 900a922696cb5412c9a8e6b7bbc15e6e6341f14f873b96b2c2038e82a46c8ec3: Status 404 returned error can't find the container with id 900a922696cb5412c9a8e6b7bbc15e6e6341f14f873b96b2c2038e82a46c8ec3 Dec 09 15:34:42 crc kubenswrapper[4735]: I1209 15:34:42.125102 4735 generic.go:334] "Generic (PLEG): container finished" podID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerID="260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539" exitCode=0 Dec 09 15:34:42 crc kubenswrapper[4735]: I1209 15:34:42.125138 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwh2x" event={"ID":"2706759a-474d-4f2e-8b36-a8fee5bedc74","Type":"ContainerDied","Data":"260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539"} Dec 09 15:34:42 crc kubenswrapper[4735]: I1209 15:34:42.125162 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwh2x" event={"ID":"2706759a-474d-4f2e-8b36-a8fee5bedc74","Type":"ContainerStarted","Data":"900a922696cb5412c9a8e6b7bbc15e6e6341f14f873b96b2c2038e82a46c8ec3"} Dec 09 15:34:43 crc kubenswrapper[4735]: I1209 15:34:43.131822 4735 generic.go:334] "Generic (PLEG): container finished" podID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerID="bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0" exitCode=0 Dec 09 15:34:43 crc kubenswrapper[4735]: I1209 15:34:43.131919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwh2x" event={"ID":"2706759a-474d-4f2e-8b36-a8fee5bedc74","Type":"ContainerDied","Data":"bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0"} Dec 09 15:34:44 crc kubenswrapper[4735]: I1209 15:34:44.140285 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwh2x" event={"ID":"2706759a-474d-4f2e-8b36-a8fee5bedc74","Type":"ContainerStarted","Data":"41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d"} Dec 09 15:34:44 crc kubenswrapper[4735]: I1209 15:34:44.158089 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-cwh2x" podStartSLOduration=1.653817401 podStartE2EDuration="3.158075741s" podCreationTimestamp="2025-12-09 15:34:41 +0000 UTC" firstStartedPulling="2025-12-09 15:34:42.126311346 +0000 UTC m=+2161.051149974" lastFinishedPulling="2025-12-09 15:34:43.630569686 +0000 UTC m=+2162.555408314" observedRunningTime="2025-12-09 15:34:44.153915459 +0000 UTC m=+2163.078754087" watchObservedRunningTime="2025-12-09 15:34:44.158075741 +0000 UTC m=+2163.082914359" Dec 09 15:34:45 crc kubenswrapper[4735]: I1209 15:34:45.088381 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:45 crc kubenswrapper[4735]: I1209 15:34:45.088629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:45 crc kubenswrapper[4735]: I1209 15:34:45.120002 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:45 crc kubenswrapper[4735]: I1209 15:34:45.172978 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:45 crc kubenswrapper[4735]: E1209 15:34:45.414652 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081" Dec 09 15:34:46 crc kubenswrapper[4735]: I1209 15:34:46.353974 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcxf"] Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.156186 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rcxf" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="registry-server" containerID="cri-o://b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0" gracePeriod=2 Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.472740 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.622412 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msz7f\" (UniqueName: \"kubernetes.io/projected/6607c144-e5d7-452f-9237-66e178e6e6a1-kube-api-access-msz7f\") pod \"6607c144-e5d7-452f-9237-66e178e6e6a1\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.622453 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-utilities\") pod \"6607c144-e5d7-452f-9237-66e178e6e6a1\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.622476 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-catalog-content\") pod \"6607c144-e5d7-452f-9237-66e178e6e6a1\" (UID: \"6607c144-e5d7-452f-9237-66e178e6e6a1\") " Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.623628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-utilities" (OuterVolumeSpecName: "utilities") pod "6607c144-e5d7-452f-9237-66e178e6e6a1" (UID: "6607c144-e5d7-452f-9237-66e178e6e6a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.626875 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6607c144-e5d7-452f-9237-66e178e6e6a1-kube-api-access-msz7f" (OuterVolumeSpecName: "kube-api-access-msz7f") pod "6607c144-e5d7-452f-9237-66e178e6e6a1" (UID: "6607c144-e5d7-452f-9237-66e178e6e6a1"). InnerVolumeSpecName "kube-api-access-msz7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.637778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6607c144-e5d7-452f-9237-66e178e6e6a1" (UID: "6607c144-e5d7-452f-9237-66e178e6e6a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.723692 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msz7f\" (UniqueName: \"kubernetes.io/projected/6607c144-e5d7-452f-9237-66e178e6e6a1-kube-api-access-msz7f\") on node \"crc\" DevicePath \"\"" Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.723716 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 09 15:34:47 crc kubenswrapper[4735]: I1209 15:34:47.723726 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6607c144-e5d7-452f-9237-66e178e6e6a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.164574 4735 generic.go:334] "Generic (PLEG): container finished" podID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerID="b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0" exitCode=0 Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.164615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcxf" event={"ID":"6607c144-e5d7-452f-9237-66e178e6e6a1","Type":"ContainerDied","Data":"b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0"} Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.164643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rcxf" event={"ID":"6607c144-e5d7-452f-9237-66e178e6e6a1","Type":"ContainerDied","Data":"13faa3346c360e24705f0dddbbad6027bce72de412d5e4a72bae0c329564d88f"} Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.164647 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rcxf" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.164670 4735 scope.go:117] "RemoveContainer" containerID="b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.178036 4735 scope.go:117] "RemoveContainer" containerID="6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.187895 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcxf"] Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.190704 4735 scope.go:117] "RemoveContainer" containerID="d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.191742 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rcxf"] Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.220498 4735 scope.go:117] "RemoveContainer" containerID="b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0" Dec 09 15:34:48 crc kubenswrapper[4735]: E1209 15:34:48.220842 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0\": container with ID starting with b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0 not found: ID does not exist" containerID="b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.220876 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0"} err="failed to get container status \"b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0\": rpc error: code = NotFound desc = could not find container \"b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0\": container with ID starting with b3cbeb66d9e59aaa245d5b1e5e3b0cd815e84e38f85789ece140187a6180e9c0 not found: ID does not exist" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.220898 4735 scope.go:117] "RemoveContainer" containerID="6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097" Dec 09 15:34:48 crc kubenswrapper[4735]: E1209 15:34:48.221133 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097\": container with ID starting with 6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097 not found: ID does not exist" containerID="6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.221184 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097"} err="failed to get container status \"6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097\": rpc error: code = NotFound desc = could not find container \"6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097\": container with ID starting with 6312ebe883f6892bbcc9b0932e308173026824549e3b20ad7949e2888cb99097 not found: ID does not exist" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.221202 4735 scope.go:117] "RemoveContainer" 
containerID="d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956" Dec 09 15:34:48 crc kubenswrapper[4735]: E1209 15:34:48.221414 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956\": container with ID starting with d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956 not found: ID does not exist" containerID="d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956" Dec 09 15:34:48 crc kubenswrapper[4735]: I1209 15:34:48.221437 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956"} err="failed to get container status \"d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956\": rpc error: code = NotFound desc = could not find container \"d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956\": container with ID starting with d01fe141d56ac5fa8b653544be233e7c3af411cdfbd7b34b8c6953dccdf1c956 not found: ID does not exist" Dec 09 15:34:49 crc kubenswrapper[4735]: I1209 15:34:49.420644 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" path="/var/lib/kubelet/pods/6607c144-e5d7-452f-9237-66e178e6e6a1/volumes" Dec 09 15:34:51 crc kubenswrapper[4735]: I1209 15:34:51.477101 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:51 crc kubenswrapper[4735]: I1209 15:34:51.477298 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:51 crc kubenswrapper[4735]: I1209 15:34:51.507792 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:52 crc kubenswrapper[4735]: I1209 15:34:52.221760 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cwh2x" Dec 09 15:34:52 crc kubenswrapper[4735]: I1209 15:34:52.252224 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwh2x"] Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.201145 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cwh2x" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="registry-server" containerID="cri-o://41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d" gracePeriod=2 Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.523745 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cwh2x"
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.596908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7bhj\" (UniqueName: \"kubernetes.io/projected/2706759a-474d-4f2e-8b36-a8fee5bedc74-kube-api-access-k7bhj\") pod \"2706759a-474d-4f2e-8b36-a8fee5bedc74\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") "
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.596992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-catalog-content\") pod \"2706759a-474d-4f2e-8b36-a8fee5bedc74\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") "
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.597070 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-utilities\") pod \"2706759a-474d-4f2e-8b36-a8fee5bedc74\" (UID: \"2706759a-474d-4f2e-8b36-a8fee5bedc74\") "
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.597744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-utilities" (OuterVolumeSpecName: "utilities") pod "2706759a-474d-4f2e-8b36-a8fee5bedc74" (UID: "2706759a-474d-4f2e-8b36-a8fee5bedc74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.601094 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2706759a-474d-4f2e-8b36-a8fee5bedc74-kube-api-access-k7bhj" (OuterVolumeSpecName: "kube-api-access-k7bhj") pod "2706759a-474d-4f2e-8b36-a8fee5bedc74" (UID: "2706759a-474d-4f2e-8b36-a8fee5bedc74"). InnerVolumeSpecName "kube-api-access-k7bhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.634686 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2706759a-474d-4f2e-8b36-a8fee5bedc74" (UID: "2706759a-474d-4f2e-8b36-a8fee5bedc74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.698169 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.698197 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7bhj\" (UniqueName: \"kubernetes.io/projected/2706759a-474d-4f2e-8b36-a8fee5bedc74-kube-api-access-k7bhj\") on node \"crc\" DevicePath \"\""
Dec 09 15:34:54 crc kubenswrapper[4735]: I1209 15:34:54.698209 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2706759a-474d-4f2e-8b36-a8fee5bedc74-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.208713 4735 generic.go:334] "Generic (PLEG): container finished" podID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerID="41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d" exitCode=0
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.208756 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwh2x" event={"ID":"2706759a-474d-4f2e-8b36-a8fee5bedc74","Type":"ContainerDied","Data":"41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d"}
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.208784 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cwh2x" event={"ID":"2706759a-474d-4f2e-8b36-a8fee5bedc74","Type":"ContainerDied","Data":"900a922696cb5412c9a8e6b7bbc15e6e6341f14f873b96b2c2038e82a46c8ec3"}
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.208803 4735 scope.go:117] "RemoveContainer" containerID="41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.209007 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cwh2x"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.225780 4735 scope.go:117] "RemoveContainer" containerID="bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.234701 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cwh2x"]
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.239275 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cwh2x"]
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.266435 4735 scope.go:117] "RemoveContainer" containerID="260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.278583 4735 scope.go:117] "RemoveContainer" containerID="41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d"
Dec 09 15:34:55 crc kubenswrapper[4735]: E1209 15:34:55.278839 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d\": container with ID starting with 41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d not found: ID does not exist" containerID="41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.278868 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d"} err="failed to get container status \"41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d\": rpc error: code = NotFound desc = could not find container \"41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d\": container with ID starting with 41f9d0530cc680425758ef7f76a812a2b9d03f8a855f23443895a83a8617842d not found: ID does not exist"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.278887 4735 scope.go:117] "RemoveContainer" containerID="bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0"
Dec 09 15:34:55 crc kubenswrapper[4735]: E1209 15:34:55.279110 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0\": container with ID starting with bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0 not found: ID does not exist" containerID="bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.279144 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0"} err="failed to get container status \"bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0\": rpc error: code = NotFound desc = could not find container \"bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0\": container with ID starting with bb03f869b69a5f4a9095ceb9e948d7790732b82d1d617088686ae26eec39d1e0 not found: ID does not exist"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.279165 4735 scope.go:117] "RemoveContainer" containerID="260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539"
Dec 09 15:34:55 crc kubenswrapper[4735]: E1209 15:34:55.279376 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539\": container with ID starting with 260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539 not found: ID does not exist" containerID="260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.279400 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539"} err="failed to get container status \"260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539\": rpc error: code = NotFound desc = could not find container \"260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539\": container with ID starting with 260d4bae6ceb52cb2c8cc31c9b057c7e8f20e583478e98e6204f2df9116a1539 not found: ID does not exist"
Dec 09 15:34:55 crc kubenswrapper[4735]: I1209 15:34:55.420746 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" path="/var/lib/kubelet/pods/2706759a-474d-4f2e-8b36-a8fee5bedc74/volumes"
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.415653 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.473994 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qphd2"]
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.474298 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="registry-server"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474316 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="registry-server"
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.474327 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="registry-server"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474332 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="registry-server"
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.474346 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="extract-utilities"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474353 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="extract-utilities"
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.474361 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="extract-content"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474366 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="extract-content"
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.474376 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="extract-utilities"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474381 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="extract-utilities"
Dec 09 15:34:58 crc kubenswrapper[4735]: E1209 15:34:58.474394 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="extract-content"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474399 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="extract-content"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474540 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2706759a-474d-4f2e-8b36-a8fee5bedc74" containerName="registry-server"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.474553 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6607c144-e5d7-452f-9237-66e178e6e6a1" containerName="registry-server"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.475571 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.483063 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qphd2"]
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.539888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a794aad-eaba-40bb-946f-ef44371c8f84-utilities\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.539972 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a794aad-eaba-40bb-946f-ef44371c8f84-catalog-content\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.540078 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqv5p\" (UniqueName: \"kubernetes.io/projected/1a794aad-eaba-40bb-946f-ef44371c8f84-kube-api-access-wqv5p\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.641361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqv5p\" (UniqueName: \"kubernetes.io/projected/1a794aad-eaba-40bb-946f-ef44371c8f84-kube-api-access-wqv5p\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.641527 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a794aad-eaba-40bb-946f-ef44371c8f84-utilities\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.641563 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a794aad-eaba-40bb-946f-ef44371c8f84-catalog-content\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.641993 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a794aad-eaba-40bb-946f-ef44371c8f84-catalog-content\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.642059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a794aad-eaba-40bb-946f-ef44371c8f84-utilities\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.659976 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqv5p\" (UniqueName: \"kubernetes.io/projected/1a794aad-eaba-40bb-946f-ef44371c8f84-kube-api-access-wqv5p\") pod \"redhat-operators-qphd2\" (UID: \"1a794aad-eaba-40bb-946f-ef44371c8f84\") " pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.792766 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:34:58 crc kubenswrapper[4735]: I1209 15:34:58.979039 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qphd2"]
Dec 09 15:34:59 crc kubenswrapper[4735]: I1209 15:34:59.231426 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a794aad-eaba-40bb-946f-ef44371c8f84" containerID="c45bee992b0bd74184cc60534f92b7967e3dd91280c9f67b4cd9d0018260b510" exitCode=0
Dec 09 15:34:59 crc kubenswrapper[4735]: I1209 15:34:59.231763 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qphd2" event={"ID":"1a794aad-eaba-40bb-946f-ef44371c8f84","Type":"ContainerDied","Data":"c45bee992b0bd74184cc60534f92b7967e3dd91280c9f67b4cd9d0018260b510"}
Dec 09 15:34:59 crc kubenswrapper[4735]: I1209 15:34:59.231790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qphd2" event={"ID":"1a794aad-eaba-40bb-946f-ef44371c8f84","Type":"ContainerStarted","Data":"4cd0d4b8a967d7ad4d46c7d7a8aa9d2a5951740fc801dcb2c79cd0e4192c8561"}
Dec 09 15:35:04 crc kubenswrapper[4735]: I1209 15:35:04.335489 4735 patch_prober.go:28] interesting pod/machine-config-daemon-t5lmh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 09 15:35:04 crc kubenswrapper[4735]: I1209 15:35:04.336282 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 09 15:35:04 crc kubenswrapper[4735]: I1209 15:35:04.336341 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh"
Dec 09 15:35:04 crc kubenswrapper[4735]: I1209 15:35:04.336796 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"} pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 09 15:35:04 crc kubenswrapper[4735]: I1209 15:35:04.336845 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerName="machine-config-daemon" containerID="cri-o://486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba" gracePeriod=600
Dec 09 15:35:05 crc kubenswrapper[4735]: I1209 15:35:05.273890 4735 generic.go:334] "Generic (PLEG): container finished" podID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba" exitCode=0
Dec 09 15:35:05 crc kubenswrapper[4735]: I1209 15:35:05.273930 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" event={"ID":"9700326d-c8d3-42a5-8521-b0fab6ca8ffe","Type":"ContainerDied","Data":"486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"}
Dec 09 15:35:05 crc kubenswrapper[4735]: I1209 15:35:05.273962 4735 scope.go:117] "RemoveContainer" containerID="51d57515a98a21107784fffd63ae292e17e7ff456c3c70324773d956bec8552f"
Dec 09 15:35:05 crc kubenswrapper[4735]: E1209 15:35:05.625450 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:35:06 crc kubenswrapper[4735]: I1209 15:35:06.281694 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:35:06 crc kubenswrapper[4735]: E1209 15:35:06.281998 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:35:06 crc kubenswrapper[4735]: I1209 15:35:06.282849 4735 generic.go:334] "Generic (PLEG): container finished" podID="1a794aad-eaba-40bb-946f-ef44371c8f84" containerID="624f721a0ad95f9240609247d179ccaff8706f079c42591e610b4735a9d4f3fa" exitCode=0
Dec 09 15:35:06 crc kubenswrapper[4735]: I1209 15:35:06.282880 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qphd2" event={"ID":"1a794aad-eaba-40bb-946f-ef44371c8f84","Type":"ContainerDied","Data":"624f721a0ad95f9240609247d179ccaff8706f079c42591e610b4735a9d4f3fa"}
Dec 09 15:35:06 crc kubenswrapper[4735]: I1209 15:35:06.283979 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 09 15:35:07 crc kubenswrapper[4735]: I1209 15:35:07.291499 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qphd2" event={"ID":"1a794aad-eaba-40bb-946f-ef44371c8f84","Type":"ContainerStarted","Data":"bf00b3d9297dda8db3c6a33117b3521ad9415f37cf1a6c449fd66533256e75e4"}
Dec 09 15:35:07 crc kubenswrapper[4735]: I1209 15:35:07.305652 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qphd2" podStartSLOduration=1.745862755 podStartE2EDuration="9.305634001s" podCreationTimestamp="2025-12-09 15:34:58 +0000 UTC" firstStartedPulling="2025-12-09 15:34:59.233173286 +0000 UTC m=+2178.158011914" lastFinishedPulling="2025-12-09 15:35:06.792944532 +0000 UTC m=+2185.717783160" observedRunningTime="2025-12-09 15:35:07.305056737 +0000 UTC m=+2186.229895365" watchObservedRunningTime="2025-12-09 15:35:07.305634001 +0000 UTC m=+2186.230472630"
Dec 09 15:35:08 crc kubenswrapper[4735]: I1209 15:35:08.793439 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:35:08 crc kubenswrapper[4735]: I1209 15:35:08.794383 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:35:09 crc kubenswrapper[4735]: I1209 15:35:09.828130 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qphd2" podUID="1a794aad-eaba-40bb-946f-ef44371c8f84" containerName="registry-server" probeResult="failure" output=<
Dec 09 15:35:09 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s
Dec 09 15:35:09 crc kubenswrapper[4735]: >
Dec 09 15:35:13 crc kubenswrapper[4735]: E1209 15:35:13.415834 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:35:18 crc kubenswrapper[4735]: I1209 15:35:18.837462 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:35:18 crc kubenswrapper[4735]: I1209 15:35:18.878289 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qphd2"
Dec 09 15:35:18 crc kubenswrapper[4735]: I1209 15:35:18.929477 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qphd2"]
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.074352 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vzx8d"]
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.074971 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vzx8d" podUID="33564b45-b47d-4cb7-8ff6-fa0226782e59" containerName="registry-server" containerID="cri-o://655dc98f098b95b2a374222184d565e99cdcddc9380a354d96df60bb040d058b" gracePeriod=2
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.388145 4735 generic.go:334] "Generic (PLEG): container finished" podID="33564b45-b47d-4cb7-8ff6-fa0226782e59" containerID="655dc98f098b95b2a374222184d565e99cdcddc9380a354d96df60bb040d058b" exitCode=0
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.388207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerDied","Data":"655dc98f098b95b2a374222184d565e99cdcddc9380a354d96df60bb040d058b"}
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.458206 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzx8d"
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.640792 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-catalog-content\") pod \"33564b45-b47d-4cb7-8ff6-fa0226782e59\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") "
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.641001 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vktp\" (UniqueName: \"kubernetes.io/projected/33564b45-b47d-4cb7-8ff6-fa0226782e59-kube-api-access-8vktp\") pod \"33564b45-b47d-4cb7-8ff6-fa0226782e59\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") "
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.641419 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-utilities\") pod \"33564b45-b47d-4cb7-8ff6-fa0226782e59\" (UID: \"33564b45-b47d-4cb7-8ff6-fa0226782e59\") "
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.642129 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-utilities" (OuterVolumeSpecName: "utilities") pod "33564b45-b47d-4cb7-8ff6-fa0226782e59" (UID: "33564b45-b47d-4cb7-8ff6-fa0226782e59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.653653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33564b45-b47d-4cb7-8ff6-fa0226782e59-kube-api-access-8vktp" (OuterVolumeSpecName: "kube-api-access-8vktp") pod "33564b45-b47d-4cb7-8ff6-fa0226782e59" (UID: "33564b45-b47d-4cb7-8ff6-fa0226782e59"). InnerVolumeSpecName "kube-api-access-8vktp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.731431 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33564b45-b47d-4cb7-8ff6-fa0226782e59" (UID: "33564b45-b47d-4cb7-8ff6-fa0226782e59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.743560 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.743589 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vktp\" (UniqueName: \"kubernetes.io/projected/33564b45-b47d-4cb7-8ff6-fa0226782e59-kube-api-access-8vktp\") on node \"crc\" DevicePath \"\""
Dec 09 15:35:19 crc kubenswrapper[4735]: I1209 15:35:19.743609 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33564b45-b47d-4cb7-8ff6-fa0226782e59-utilities\") on node \"crc\" DevicePath \"\""
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.410255 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzx8d"
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.412643 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzx8d" event={"ID":"33564b45-b47d-4cb7-8ff6-fa0226782e59","Type":"ContainerDied","Data":"5967e861fd31f4b3a5aef9d7721d54c64f6750913ec3ffb4d224f473916d76f3"}
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.412775 4735 scope.go:117] "RemoveContainer" containerID="655dc98f098b95b2a374222184d565e99cdcddc9380a354d96df60bb040d058b"
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.414388 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:35:20 crc kubenswrapper[4735]: E1209 15:35:20.414831 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.446429 4735 scope.go:117] "RemoveContainer" containerID="b0c41cf55954787b3a7ee4ef5eb62c8a3635c79be3dc3a03d65730f38beef7dd"
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.449912 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vzx8d"]
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.455578 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vzx8d"]
Dec 09 15:35:20 crc kubenswrapper[4735]: I1209 15:35:20.467604 4735 scope.go:117] "RemoveContainer" containerID="8f060fae2d793e26a91a68951aa8f9914c3e7fbb43b679be84d28f8a8e4d2c60"
Dec 09 15:35:21 crc kubenswrapper[4735]: I1209 15:35:21.424716 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33564b45-b47d-4cb7-8ff6-fa0226782e59" path="/var/lib/kubelet/pods/33564b45-b47d-4cb7-8ff6-fa0226782e59/volumes"
Dec 09 15:35:26 crc kubenswrapper[4735]: E1209 15:35:26.416992 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:35:35 crc kubenswrapper[4735]: I1209 15:35:35.415401 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:35:35 crc kubenswrapper[4735]: E1209 15:35:35.416229 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:35:37 crc kubenswrapper[4735]: E1209 15:35:37.415719 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:35:49 crc kubenswrapper[4735]: I1209 15:35:49.414817 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:35:49 crc kubenswrapper[4735]: E1209 15:35:49.415750 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:35:49 crc kubenswrapper[4735]: E1209 15:35:49.416676 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:36:00 crc kubenswrapper[4735]: I1209 15:36:00.414485 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:36:00 crc kubenswrapper[4735]: E1209 15:36:00.415269 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:36:03 crc kubenswrapper[4735]: E1209 15:36:03.417250 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:36:14 crc kubenswrapper[4735]: I1209 15:36:14.415101 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:36:14 crc kubenswrapper[4735]: E1209 15:36:14.416231 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:36:17 crc kubenswrapper[4735]: E1209 15:36:17.416764 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"
Dec 09 15:36:26 crc kubenswrapper[4735]: I1209 15:36:26.414018 4735 scope.go:117] "RemoveContainer" containerID="486783668ece5c5d7530e591dd1d24c49280a7af2605ab926c88af7068f0b6ba"
Dec 09 15:36:26 crc kubenswrapper[4735]: E1209 15:36:26.414915 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t5lmh_openshift-machine-config-operator(9700326d-c8d3-42a5-8521-b0fab6ca8ffe)\"" pod="openshift-machine-config-operator/machine-config-daemon-t5lmh" podUID="9700326d-c8d3-42a5-8521-b0fab6ca8ffe"
Dec 09 15:36:28 crc kubenswrapper[4735]: E1209 15:36:28.416390 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/openstack-operator-index:b7b08af6c20d940e7a79480f938b88859bf69d05\\\"\"" pod="openstack-operators/openstack-operator-index-jzq8f" podUID="9bfb1b1a-87c7-4fa5-ad02-935f53dbc081"